Nov 28 18:54:28 np0005539279 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 28 18:54:28 np0005539279 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 28 18:54:28 np0005539279 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 28 18:54:28 np0005539279 kernel: BIOS-provided physical RAM map:
Nov 28 18:54:28 np0005539279 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 28 18:54:28 np0005539279 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 28 18:54:28 np0005539279 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 28 18:54:28 np0005539279 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 28 18:54:28 np0005539279 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 28 18:54:28 np0005539279 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 28 18:54:28 np0005539279 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 28 18:54:28 np0005539279 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 28 18:54:28 np0005539279 kernel: NX (Execute Disable) protection: active
Nov 28 18:54:28 np0005539279 kernel: APIC: Static calls initialized
Nov 28 18:54:28 np0005539279 kernel: SMBIOS 2.8 present.
Nov 28 18:54:28 np0005539279 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 28 18:54:28 np0005539279 kernel: Hypervisor detected: KVM
Nov 28 18:54:28 np0005539279 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 28 18:54:28 np0005539279 kernel: kvm-clock: using sched offset of 3353730951 cycles
Nov 28 18:54:28 np0005539279 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 28 18:54:28 np0005539279 kernel: tsc: Detected 2800.000 MHz processor
Nov 28 18:54:28 np0005539279 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 28 18:54:28 np0005539279 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 28 18:54:28 np0005539279 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 28 18:54:28 np0005539279 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 28 18:54:28 np0005539279 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 28 18:54:28 np0005539279 kernel: Using GB pages for direct mapping
Nov 28 18:54:28 np0005539279 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 28 18:54:28 np0005539279 kernel: ACPI: Early table checksum verification disabled
Nov 28 18:54:28 np0005539279 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 28 18:54:28 np0005539279 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 18:54:28 np0005539279 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 18:54:28 np0005539279 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 18:54:28 np0005539279 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 28 18:54:28 np0005539279 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 18:54:28 np0005539279 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 18:54:28 np0005539279 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 28 18:54:28 np0005539279 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 28 18:54:28 np0005539279 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 28 18:54:28 np0005539279 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 28 18:54:28 np0005539279 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 28 18:54:28 np0005539279 kernel: No NUMA configuration found
Nov 28 18:54:28 np0005539279 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 28 18:54:28 np0005539279 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 28 18:54:28 np0005539279 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 28 18:54:28 np0005539279 kernel: Zone ranges:
Nov 28 18:54:28 np0005539279 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 28 18:54:28 np0005539279 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 28 18:54:28 np0005539279 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 28 18:54:28 np0005539279 kernel:  Device   empty
Nov 28 18:54:28 np0005539279 kernel: Movable zone start for each node
Nov 28 18:54:28 np0005539279 kernel: Early memory node ranges
Nov 28 18:54:28 np0005539279 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 28 18:54:28 np0005539279 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 28 18:54:28 np0005539279 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 28 18:54:28 np0005539279 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 28 18:54:28 np0005539279 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 28 18:54:28 np0005539279 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 28 18:54:28 np0005539279 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 28 18:54:28 np0005539279 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 28 18:54:28 np0005539279 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 28 18:54:28 np0005539279 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 28 18:54:28 np0005539279 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 28 18:54:28 np0005539279 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 28 18:54:28 np0005539279 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 28 18:54:28 np0005539279 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 28 18:54:28 np0005539279 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 28 18:54:28 np0005539279 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 28 18:54:28 np0005539279 kernel: TSC deadline timer available
Nov 28 18:54:28 np0005539279 kernel: CPU topo: Max. logical packages:   8
Nov 28 18:54:28 np0005539279 kernel: CPU topo: Max. logical dies:       8
Nov 28 18:54:28 np0005539279 kernel: CPU topo: Max. dies per package:   1
Nov 28 18:54:28 np0005539279 kernel: CPU topo: Max. threads per core:   1
Nov 28 18:54:28 np0005539279 kernel: CPU topo: Num. cores per package:     1
Nov 28 18:54:28 np0005539279 kernel: CPU topo: Num. threads per package:   1
Nov 28 18:54:28 np0005539279 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 28 18:54:28 np0005539279 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 28 18:54:28 np0005539279 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 28 18:54:28 np0005539279 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 28 18:54:28 np0005539279 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 28 18:54:28 np0005539279 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 28 18:54:28 np0005539279 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 28 18:54:28 np0005539279 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 28 18:54:28 np0005539279 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 28 18:54:28 np0005539279 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 28 18:54:28 np0005539279 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 28 18:54:28 np0005539279 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 28 18:54:28 np0005539279 kernel: Booting paravirtualized kernel on KVM
Nov 28 18:54:28 np0005539279 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 28 18:54:28 np0005539279 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 28 18:54:28 np0005539279 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 28 18:54:28 np0005539279 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 28 18:54:28 np0005539279 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 28 18:54:28 np0005539279 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 28 18:54:28 np0005539279 kernel: random: crng init done
Nov 28 18:54:28 np0005539279 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: Fallback order for Node 0: 0 
Nov 28 18:54:28 np0005539279 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 28 18:54:28 np0005539279 kernel: Policy zone: Normal
Nov 28 18:54:28 np0005539279 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 28 18:54:28 np0005539279 kernel: software IO TLB: area num 8.
Nov 28 18:54:28 np0005539279 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 28 18:54:28 np0005539279 kernel: ftrace: allocating 49313 entries in 193 pages
Nov 28 18:54:28 np0005539279 kernel: ftrace: allocated 193 pages with 3 groups
Nov 28 18:54:28 np0005539279 kernel: Dynamic Preempt: voluntary
Nov 28 18:54:28 np0005539279 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 28 18:54:28 np0005539279 kernel: rcu: #011RCU event tracing is enabled.
Nov 28 18:54:28 np0005539279 kernel: rcu: #011RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 28 18:54:28 np0005539279 kernel: #011Trampoline variant of Tasks RCU enabled.
Nov 28 18:54:28 np0005539279 kernel: #011Rude variant of Tasks RCU enabled.
Nov 28 18:54:28 np0005539279 kernel: #011Tracing variant of Tasks RCU enabled.
Nov 28 18:54:28 np0005539279 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 28 18:54:28 np0005539279 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 28 18:54:28 np0005539279 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 28 18:54:28 np0005539279 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 28 18:54:28 np0005539279 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 28 18:54:28 np0005539279 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 28 18:54:28 np0005539279 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 28 18:54:28 np0005539279 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 28 18:54:28 np0005539279 kernel: Console: colour VGA+ 80x25
Nov 28 18:54:28 np0005539279 kernel: printk: console [ttyS0] enabled
Nov 28 18:54:28 np0005539279 kernel: ACPI: Core revision 20230331
Nov 28 18:54:28 np0005539279 kernel: APIC: Switch to symmetric I/O mode setup
Nov 28 18:54:28 np0005539279 kernel: x2apic enabled
Nov 28 18:54:28 np0005539279 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 28 18:54:28 np0005539279 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 28 18:54:28 np0005539279 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Nov 28 18:54:28 np0005539279 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 28 18:54:28 np0005539279 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 28 18:54:28 np0005539279 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 28 18:54:28 np0005539279 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 28 18:54:28 np0005539279 kernel: Spectre V2 : Mitigation: Retpolines
Nov 28 18:54:28 np0005539279 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 28 18:54:28 np0005539279 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 28 18:54:28 np0005539279 kernel: RETBleed: Mitigation: untrained return thunk
Nov 28 18:54:28 np0005539279 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 28 18:54:28 np0005539279 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 28 18:54:28 np0005539279 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 28 18:54:28 np0005539279 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 28 18:54:28 np0005539279 kernel: x86/bugs: return thunk changed
Nov 28 18:54:28 np0005539279 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 28 18:54:28 np0005539279 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 28 18:54:28 np0005539279 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 28 18:54:28 np0005539279 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 28 18:54:28 np0005539279 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 28 18:54:28 np0005539279 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 28 18:54:28 np0005539279 kernel: Freeing SMP alternatives memory: 40K
Nov 28 18:54:28 np0005539279 kernel: pid_max: default: 32768 minimum: 301
Nov 28 18:54:28 np0005539279 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 28 18:54:28 np0005539279 kernel: landlock: Up and running.
Nov 28 18:54:28 np0005539279 kernel: Yama: becoming mindful.
Nov 28 18:54:28 np0005539279 kernel: SELinux:  Initializing.
Nov 28 18:54:28 np0005539279 kernel: LSM support for eBPF active
Nov 28 18:54:28 np0005539279 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 28 18:54:28 np0005539279 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 28 18:54:28 np0005539279 kernel: ... version:                0
Nov 28 18:54:28 np0005539279 kernel: ... bit width:              48
Nov 28 18:54:28 np0005539279 kernel: ... generic registers:      6
Nov 28 18:54:28 np0005539279 kernel: ... value mask:             0000ffffffffffff
Nov 28 18:54:28 np0005539279 kernel: ... max period:             00007fffffffffff
Nov 28 18:54:28 np0005539279 kernel: ... fixed-purpose events:   0
Nov 28 18:54:28 np0005539279 kernel: ... event mask:             000000000000003f
Nov 28 18:54:28 np0005539279 kernel: signal: max sigframe size: 1776
Nov 28 18:54:28 np0005539279 kernel: rcu: Hierarchical SRCU implementation.
Nov 28 18:54:28 np0005539279 kernel: rcu: #011Max phase no-delay instances is 400.
Nov 28 18:54:28 np0005539279 kernel: smp: Bringing up secondary CPUs ...
Nov 28 18:54:28 np0005539279 kernel: smpboot: x86: Booting SMP configuration:
Nov 28 18:54:28 np0005539279 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 28 18:54:28 np0005539279 kernel: smp: Brought up 1 node, 8 CPUs
Nov 28 18:54:28 np0005539279 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Nov 28 18:54:28 np0005539279 kernel: node 0 deferred pages initialised in 12ms
Nov 28 18:54:28 np0005539279 kernel: Memory: 7765856K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616272K reserved, 0K cma-reserved)
Nov 28 18:54:28 np0005539279 kernel: devtmpfs: initialized
Nov 28 18:54:28 np0005539279 kernel: x86/mm: Memory block size: 128MB
Nov 28 18:54:28 np0005539279 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 28 18:54:28 np0005539279 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: pinctrl core: initialized pinctrl subsystem
Nov 28 18:54:28 np0005539279 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 28 18:54:28 np0005539279 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 28 18:54:28 np0005539279 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 28 18:54:28 np0005539279 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 28 18:54:28 np0005539279 kernel: audit: initializing netlink subsys (disabled)
Nov 28 18:54:28 np0005539279 kernel: audit: type=2000 audit(1764374065.936:1): state=initialized audit_enabled=0 res=1
Nov 28 18:54:28 np0005539279 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 28 18:54:28 np0005539279 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 28 18:54:28 np0005539279 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 28 18:54:28 np0005539279 kernel: cpuidle: using governor menu
Nov 28 18:54:28 np0005539279 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 28 18:54:28 np0005539279 kernel: PCI: Using configuration type 1 for base access
Nov 28 18:54:28 np0005539279 kernel: PCI: Using configuration type 1 for extended access
Nov 28 18:54:28 np0005539279 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 28 18:54:28 np0005539279 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 28 18:54:28 np0005539279 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 28 18:54:28 np0005539279 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 28 18:54:28 np0005539279 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 28 18:54:28 np0005539279 kernel: Demotion targets for Node 0: null
Nov 28 18:54:28 np0005539279 kernel: cryptd: max_cpu_qlen set to 1000
Nov 28 18:54:28 np0005539279 kernel: ACPI: Added _OSI(Module Device)
Nov 28 18:54:28 np0005539279 kernel: ACPI: Added _OSI(Processor Device)
Nov 28 18:54:28 np0005539279 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 28 18:54:28 np0005539279 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 28 18:54:28 np0005539279 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 28 18:54:28 np0005539279 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 28 18:54:28 np0005539279 kernel: ACPI: Interpreter enabled
Nov 28 18:54:28 np0005539279 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 28 18:54:28 np0005539279 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 28 18:54:28 np0005539279 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 28 18:54:28 np0005539279 kernel: PCI: Using E820 reservations for host bridge windows
Nov 28 18:54:28 np0005539279 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 28 18:54:28 np0005539279 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 28 18:54:28 np0005539279 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [3] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [4] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [5] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [6] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [7] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [8] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [9] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [10] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [11] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [12] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [13] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [14] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [15] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [16] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [17] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [18] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [19] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [20] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [21] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [22] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [23] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [24] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [25] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [26] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [27] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [28] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [29] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [30] registered
Nov 28 18:54:28 np0005539279 kernel: acpiphp: Slot [31] registered
Nov 28 18:54:28 np0005539279 kernel: PCI host bridge to bus 0000:00
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 28 18:54:28 np0005539279 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 28 18:54:28 np0005539279 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 28 18:54:28 np0005539279 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 28 18:54:28 np0005539279 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 28 18:54:28 np0005539279 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 28 18:54:28 np0005539279 kernel: iommu: Default domain type: Translated
Nov 28 18:54:28 np0005539279 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 28 18:54:28 np0005539279 kernel: SCSI subsystem initialized
Nov 28 18:54:28 np0005539279 kernel: ACPI: bus type USB registered
Nov 28 18:54:28 np0005539279 kernel: usbcore: registered new interface driver usbfs
Nov 28 18:54:28 np0005539279 kernel: usbcore: registered new interface driver hub
Nov 28 18:54:28 np0005539279 kernel: usbcore: registered new device driver usb
Nov 28 18:54:28 np0005539279 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 28 18:54:28 np0005539279 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 28 18:54:28 np0005539279 kernel: PTP clock support registered
Nov 28 18:54:28 np0005539279 kernel: EDAC MC: Ver: 3.0.0
Nov 28 18:54:28 np0005539279 kernel: NetLabel: Initializing
Nov 28 18:54:28 np0005539279 kernel: NetLabel:  domain hash size = 128
Nov 28 18:54:28 np0005539279 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 28 18:54:28 np0005539279 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 28 18:54:28 np0005539279 kernel: PCI: Using ACPI for IRQ routing
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 28 18:54:28 np0005539279 kernel: vgaarb: loaded
Nov 28 18:54:28 np0005539279 kernel: clocksource: Switched to clocksource kvm-clock
Nov 28 18:54:28 np0005539279 kernel: VFS: Disk quotas dquot_6.6.0
Nov 28 18:54:28 np0005539279 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 28 18:54:28 np0005539279 kernel: pnp: PnP ACPI init
Nov 28 18:54:28 np0005539279 kernel: pnp: PnP ACPI: found 5 devices
Nov 28 18:54:28 np0005539279 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 28 18:54:28 np0005539279 kernel: NET: Registered PF_INET protocol family
Nov 28 18:54:28 np0005539279 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 28 18:54:28 np0005539279 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 28 18:54:28 np0005539279 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 28 18:54:28 np0005539279 kernel: NET: Registered PF_XDP protocol family
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 28 18:54:28 np0005539279 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 28 18:54:28 np0005539279 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 28 18:54:28 np0005539279 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 86946 usecs
Nov 28 18:54:28 np0005539279 kernel: PCI: CLS 0 bytes, default 64
Nov 28 18:54:28 np0005539279 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 28 18:54:28 np0005539279 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 28 18:54:28 np0005539279 kernel: ACPI: bus type thunderbolt registered
Nov 28 18:54:28 np0005539279 kernel: Trying to unpack rootfs image as initramfs...
Nov 28 18:54:28 np0005539279 kernel: Initialise system trusted keyrings
Nov 28 18:54:28 np0005539279 kernel: Key type blacklist registered
Nov 28 18:54:28 np0005539279 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 28 18:54:28 np0005539279 kernel: zbud: loaded
Nov 28 18:54:28 np0005539279 kernel: integrity: Platform Keyring initialized
Nov 28 18:54:28 np0005539279 kernel: integrity: Machine keyring initialized
Nov 28 18:54:28 np0005539279 kernel: Freeing initrd memory: 85868K
Nov 28 18:54:28 np0005539279 kernel: NET: Registered PF_ALG protocol family
Nov 28 18:54:28 np0005539279 kernel: xor: automatically using best checksumming function   avx       
Nov 28 18:54:28 np0005539279 kernel: Key type asymmetric registered
Nov 28 18:54:28 np0005539279 kernel: Asymmetric key parser 'x509' registered
Nov 28 18:54:28 np0005539279 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 28 18:54:28 np0005539279 kernel: io scheduler mq-deadline registered
Nov 28 18:54:28 np0005539279 kernel: io scheduler kyber registered
Nov 28 18:54:28 np0005539279 kernel: io scheduler bfq registered
Nov 28 18:54:28 np0005539279 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 28 18:54:28 np0005539279 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 28 18:54:28 np0005539279 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 28 18:54:28 np0005539279 kernel: ACPI: button: Power Button [PWRF]
Nov 28 18:54:28 np0005539279 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 28 18:54:28 np0005539279 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 28 18:54:28 np0005539279 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 28 18:54:28 np0005539279 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 28 18:54:28 np0005539279 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 28 18:54:28 np0005539279 kernel: Non-volatile memory driver v1.3
Nov 28 18:54:28 np0005539279 kernel: rdac: device handler registered
Nov 28 18:54:28 np0005539279 kernel: hp_sw: device handler registered
Nov 28 18:54:28 np0005539279 kernel: emc: device handler registered
Nov 28 18:54:28 np0005539279 kernel: alua: device handler registered
Nov 28 18:54:28 np0005539279 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 28 18:54:28 np0005539279 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 28 18:54:28 np0005539279 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 28 18:54:28 np0005539279 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 28 18:54:28 np0005539279 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 28 18:54:28 np0005539279 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 28 18:54:28 np0005539279 kernel: usb usb1: Product: UHCI Host Controller
Nov 28 18:54:28 np0005539279 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 28 18:54:28 np0005539279 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 28 18:54:28 np0005539279 kernel: hub 1-0:1.0: USB hub found
Nov 28 18:54:28 np0005539279 kernel: hub 1-0:1.0: 2 ports detected
Nov 28 18:54:28 np0005539279 kernel: usbcore: registered new interface driver usbserial_generic
Nov 28 18:54:28 np0005539279 kernel: usbserial: USB Serial support registered for generic
Nov 28 18:54:28 np0005539279 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 28 18:54:28 np0005539279 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 28 18:54:28 np0005539279 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 28 18:54:28 np0005539279 kernel: mousedev: PS/2 mouse device common for all mice
Nov 28 18:54:28 np0005539279 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 28 18:54:28 np0005539279 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 28 18:54:28 np0005539279 kernel: rtc_cmos 00:04: registered as rtc0
Nov 28 18:54:28 np0005539279 kernel: rtc_cmos 00:04: setting system clock to 2025-11-28T23:54:27 UTC (1764374067)
Nov 28 18:54:28 np0005539279 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 28 18:54:28 np0005539279 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 28 18:54:28 np0005539279 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 28 18:54:28 np0005539279 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 28 18:54:28 np0005539279 kernel: usbcore: registered new interface driver usbhid
Nov 28 18:54:28 np0005539279 kernel: usbhid: USB HID core driver
Nov 28 18:54:28 np0005539279 kernel: drop_monitor: Initializing network drop monitor service
Nov 28 18:54:28 np0005539279 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 28 18:54:28 np0005539279 kernel: Initializing XFRM netlink socket
Nov 28 18:54:28 np0005539279 kernel: NET: Registered PF_INET6 protocol family
Nov 28 18:54:28 np0005539279 kernel: Segment Routing with IPv6
Nov 28 18:54:28 np0005539279 kernel: NET: Registered PF_PACKET protocol family
Nov 28 18:54:28 np0005539279 kernel: mpls_gso: MPLS GSO support
Nov 28 18:54:28 np0005539279 kernel: IPI shorthand broadcast: enabled
Nov 28 18:54:28 np0005539279 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 28 18:54:28 np0005539279 kernel: AES CTR mode by8 optimization enabled
Nov 28 18:54:28 np0005539279 kernel: sched_clock: Marking stable (1303001930, 143802870)->(1564099269, -117294469)
Nov 28 18:54:28 np0005539279 kernel: registered taskstats version 1
Nov 28 18:54:28 np0005539279 kernel: Loading compiled-in X.509 certificates
Nov 28 18:54:28 np0005539279 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 28 18:54:28 np0005539279 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 28 18:54:28 np0005539279 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 28 18:54:28 np0005539279 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 28 18:54:28 np0005539279 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 28 18:54:28 np0005539279 kernel: Demotion targets for Node 0: null
Nov 28 18:54:28 np0005539279 kernel: page_owner is disabled
Nov 28 18:54:28 np0005539279 kernel: Key type .fscrypt registered
Nov 28 18:54:28 np0005539279 kernel: Key type fscrypt-provisioning registered
Nov 28 18:54:28 np0005539279 kernel: Key type big_key registered
Nov 28 18:54:28 np0005539279 kernel: Key type encrypted registered
Nov 28 18:54:28 np0005539279 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 28 18:54:28 np0005539279 kernel: Loading compiled-in module X.509 certificates
Nov 28 18:54:28 np0005539279 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 28 18:54:28 np0005539279 kernel: ima: Allocated hash algorithm: sha256
Nov 28 18:54:28 np0005539279 kernel: ima: No architecture policies found
Nov 28 18:54:28 np0005539279 kernel: evm: Initialising EVM extended attributes:
Nov 28 18:54:28 np0005539279 kernel: evm: security.selinux
Nov 28 18:54:28 np0005539279 kernel: evm: security.SMACK64 (disabled)
Nov 28 18:54:28 np0005539279 kernel: evm: security.SMACK64EXEC (disabled)
Nov 28 18:54:28 np0005539279 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 28 18:54:28 np0005539279 kernel: evm: security.SMACK64MMAP (disabled)
Nov 28 18:54:28 np0005539279 kernel: evm: security.apparmor (disabled)
Nov 28 18:54:28 np0005539279 kernel: evm: security.ima
Nov 28 18:54:28 np0005539279 kernel: evm: security.capability
Nov 28 18:54:28 np0005539279 kernel: evm: HMAC attrs: 0x1
Nov 28 18:54:28 np0005539279 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 28 18:54:28 np0005539279 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 28 18:54:28 np0005539279 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 28 18:54:28 np0005539279 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 28 18:54:28 np0005539279 kernel: usb 1-1: Manufacturer: QEMU
Nov 28 18:54:28 np0005539279 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 28 18:54:28 np0005539279 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 28 18:54:28 np0005539279 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 28 18:54:28 np0005539279 kernel: Running certificate verification RSA selftest
Nov 28 18:54:28 np0005539279 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 28 18:54:28 np0005539279 kernel: Running certificate verification ECDSA selftest
Nov 28 18:54:28 np0005539279 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 28 18:54:28 np0005539279 kernel: clk: Disabling unused clocks
Nov 28 18:54:28 np0005539279 kernel: Freeing unused decrypted memory: 2028K
Nov 28 18:54:28 np0005539279 kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 28 18:54:28 np0005539279 kernel: Write protecting the kernel read-only data: 30720k
Nov 28 18:54:28 np0005539279 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 28 18:54:28 np0005539279 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 28 18:54:28 np0005539279 kernel: Run /init as init process
Nov 28 18:54:28 np0005539279 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 18:54:28 np0005539279 systemd: Detected virtualization kvm.
Nov 28 18:54:28 np0005539279 systemd: Detected architecture x86-64.
Nov 28 18:54:28 np0005539279 systemd: Running in initrd.
Nov 28 18:54:28 np0005539279 systemd: No hostname configured, using default hostname.
Nov 28 18:54:28 np0005539279 systemd: Hostname set to <localhost>.
Nov 28 18:54:28 np0005539279 systemd: Initializing machine ID from VM UUID.
Nov 28 18:54:28 np0005539279 systemd: Queued start job for default target Initrd Default Target.
Nov 28 18:54:28 np0005539279 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 18:54:28 np0005539279 systemd: Reached target Local Encrypted Volumes.
Nov 28 18:54:28 np0005539279 systemd: Reached target Initrd /usr File System.
Nov 28 18:54:28 np0005539279 systemd: Reached target Local File Systems.
Nov 28 18:54:28 np0005539279 systemd: Reached target Path Units.
Nov 28 18:54:28 np0005539279 systemd: Reached target Slice Units.
Nov 28 18:54:28 np0005539279 systemd: Reached target Swaps.
Nov 28 18:54:28 np0005539279 systemd: Reached target Timer Units.
Nov 28 18:54:28 np0005539279 systemd: Listening on D-Bus System Message Bus Socket.
Nov 28 18:54:28 np0005539279 systemd: Listening on Journal Socket (/dev/log).
Nov 28 18:54:28 np0005539279 systemd: Listening on Journal Socket.
Nov 28 18:54:28 np0005539279 systemd: Listening on udev Control Socket.
Nov 28 18:54:28 np0005539279 systemd: Listening on udev Kernel Socket.
Nov 28 18:54:28 np0005539279 systemd: Reached target Socket Units.
Nov 28 18:54:28 np0005539279 systemd: Starting Create List of Static Device Nodes...
Nov 28 18:54:28 np0005539279 systemd: Starting Journal Service...
Nov 28 18:54:28 np0005539279 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 28 18:54:28 np0005539279 systemd: Starting Apply Kernel Variables...
Nov 28 18:54:28 np0005539279 systemd: Starting Create System Users...
Nov 28 18:54:28 np0005539279 systemd: Starting Setup Virtual Console...
Nov 28 18:54:28 np0005539279 systemd: Finished Create List of Static Device Nodes.
Nov 28 18:54:28 np0005539279 systemd: Finished Apply Kernel Variables.
Nov 28 18:54:28 np0005539279 systemd: Finished Create System Users.
Nov 28 18:54:28 np0005539279 systemd-journald[307]: Journal started
Nov 28 18:54:28 np0005539279 systemd-journald[307]: Runtime Journal (/run/log/journal/0b852fc4ac1f4ef6847b925a46032b4e) is 8.0M, max 153.6M, 145.6M free.
Nov 28 18:54:28 np0005539279 systemd-sysusers[311]: Creating group 'users' with GID 100.
Nov 28 18:54:28 np0005539279 systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Nov 28 18:54:28 np0005539279 systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 28 18:54:28 np0005539279 systemd: Started Journal Service.
Nov 28 18:54:28 np0005539279 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 18:54:28 np0005539279 systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 18:54:28 np0005539279 systemd[1]: Finished Setup Virtual Console.
Nov 28 18:54:28 np0005539279 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 28 18:54:28 np0005539279 systemd[1]: Starting dracut cmdline hook...
Nov 28 18:54:28 np0005539279 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 18:54:28 np0005539279 dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Nov 28 18:54:28 np0005539279 dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 28 18:54:28 np0005539279 systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 18:54:28 np0005539279 systemd[1]: Finished dracut cmdline hook.
Nov 28 18:54:28 np0005539279 systemd[1]: Starting dracut pre-udev hook...
Nov 28 18:54:28 np0005539279 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 28 18:54:28 np0005539279 kernel: device-mapper: uevent: version 1.0.3
Nov 28 18:54:28 np0005539279 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 28 18:54:28 np0005539279 kernel: RPC: Registered named UNIX socket transport module.
Nov 28 18:54:28 np0005539279 kernel: RPC: Registered udp transport module.
Nov 28 18:54:28 np0005539279 kernel: RPC: Registered tcp transport module.
Nov 28 18:54:28 np0005539279 kernel: RPC: Registered tcp-with-tls transport module.
Nov 28 18:54:28 np0005539279 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 28 18:54:29 np0005539279 rpc.statd[443]: Version 2.5.4 starting
Nov 28 18:54:29 np0005539279 rpc.statd[443]: Initializing NSM state
Nov 28 18:54:29 np0005539279 rpc.idmapd[448]: Setting log level to 0
Nov 28 18:54:29 np0005539279 systemd[1]: Finished dracut pre-udev hook.
Nov 28 18:54:29 np0005539279 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 18:54:29 np0005539279 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 18:54:29 np0005539279 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 18:54:29 np0005539279 systemd[1]: Starting dracut pre-trigger hook...
Nov 28 18:54:29 np0005539279 systemd[1]: Finished dracut pre-trigger hook.
Nov 28 18:54:29 np0005539279 systemd[1]: Starting Coldplug All udev Devices...
Nov 28 18:54:29 np0005539279 systemd[1]: Created slice Slice /system/modprobe.
Nov 28 18:54:29 np0005539279 systemd[1]: Starting Load Kernel Module configfs...
Nov 28 18:54:29 np0005539279 systemd[1]: Finished Coldplug All udev Devices.
Nov 28 18:54:29 np0005539279 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 18:54:29 np0005539279 systemd[1]: Finished Load Kernel Module configfs.
Nov 28 18:54:29 np0005539279 systemd[1]: Mounting Kernel Configuration File System...
Nov 28 18:54:29 np0005539279 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 18:54:29 np0005539279 systemd[1]: Reached target Network.
Nov 28 18:54:29 np0005539279 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 18:54:29 np0005539279 systemd[1]: Starting dracut initqueue hook...
Nov 28 18:54:29 np0005539279 systemd[1]: Mounted Kernel Configuration File System.
Nov 28 18:54:29 np0005539279 systemd[1]: Reached target System Initialization.
Nov 28 18:54:29 np0005539279 systemd[1]: Reached target Basic System.
Nov 28 18:54:29 np0005539279 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 28 18:54:29 np0005539279 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 28 18:54:29 np0005539279 kernel: vda: vda1
Nov 28 18:54:29 np0005539279 kernel: scsi host0: ata_piix
Nov 28 18:54:29 np0005539279 kernel: scsi host1: ata_piix
Nov 28 18:54:29 np0005539279 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 28 18:54:29 np0005539279 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 28 18:54:29 np0005539279 systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 28 18:54:29 np0005539279 systemd[1]: Reached target Initrd Root Device.
Nov 28 18:54:29 np0005539279 kernel: ata1: found unknown device (class 0)
Nov 28 18:54:29 np0005539279 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 28 18:54:29 np0005539279 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 28 18:54:29 np0005539279 systemd-udevd[497]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 18:54:29 np0005539279 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 28 18:54:29 np0005539279 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 28 18:54:29 np0005539279 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 28 18:54:29 np0005539279 systemd[1]: Finished dracut initqueue hook.
Nov 28 18:54:29 np0005539279 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 18:54:29 np0005539279 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 28 18:54:29 np0005539279 systemd[1]: Reached target Remote File Systems.
Nov 28 18:54:29 np0005539279 systemd[1]: Starting dracut pre-mount hook...
Nov 28 18:54:29 np0005539279 systemd[1]: Finished dracut pre-mount hook.
Nov 28 18:54:29 np0005539279 systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 28 18:54:30 np0005539279 systemd-fsck[558]: /usr/sbin/fsck.xfs: XFS file system.
Nov 28 18:54:30 np0005539279 systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 28 18:54:30 np0005539279 systemd[1]: Mounting /sysroot...
Nov 28 18:54:30 np0005539279 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 28 18:54:30 np0005539279 kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 28 18:54:30 np0005539279 kernel: XFS (vda1): Ending clean mount
Nov 28 18:54:31 np0005539279 systemd[1]: Mounted /sysroot.
Nov 28 18:54:31 np0005539279 systemd[1]: Reached target Initrd Root File System.
Nov 28 18:54:31 np0005539279 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 28 18:54:31 np0005539279 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 28 18:54:31 np0005539279 systemd[1]: Reached target Initrd File Systems.
Nov 28 18:54:31 np0005539279 systemd[1]: Reached target Initrd Default Target.
Nov 28 18:54:31 np0005539279 systemd[1]: Starting dracut mount hook...
Nov 28 18:54:31 np0005539279 systemd[1]: Finished dracut mount hook.
Nov 28 18:54:31 np0005539279 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 28 18:54:31 np0005539279 rpc.idmapd[448]: exiting on signal 15
Nov 28 18:54:31 np0005539279 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 28 18:54:31 np0005539279 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Network.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Timer Units.
Nov 28 18:54:31 np0005539279 systemd[1]: dbus.socket: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 28 18:54:31 np0005539279 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Initrd Default Target.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Basic System.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Initrd Root Device.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Initrd /usr File System.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Path Units.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Remote File Systems.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Slice Units.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Socket Units.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target System Initialization.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Local File Systems.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Swaps.
Nov 28 18:54:31 np0005539279 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped dracut mount hook.
Nov 28 18:54:31 np0005539279 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped dracut pre-mount hook.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 28 18:54:31 np0005539279 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped dracut initqueue hook.
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped Apply Kernel Variables.
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped Coldplug All udev Devices.
Nov 28 18:54:31 np0005539279 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped dracut pre-trigger hook.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped Setup Virtual Console.
Nov 28 18:54:31 np0005539279 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-udevd.service: Consumed 1.115s CPU time.
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Closed udev Control Socket.
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Closed udev Kernel Socket.
Nov 28 18:54:31 np0005539279 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped dracut pre-udev hook.
Nov 28 18:54:31 np0005539279 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped dracut cmdline hook.
Nov 28 18:54:31 np0005539279 systemd[1]: Starting Cleanup udev Database...
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 28 18:54:31 np0005539279 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 28 18:54:31 np0005539279 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Stopped Create System Users.
Nov 28 18:54:31 np0005539279 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 28 18:54:31 np0005539279 systemd[1]: Finished Cleanup udev Database.
Nov 28 18:54:31 np0005539279 systemd[1]: Reached target Switch Root.
Nov 28 18:54:31 np0005539279 systemd[1]: Starting Switch Root...
Nov 28 18:54:31 np0005539279 systemd[1]: Switching root.
Nov 28 18:54:31 np0005539279 systemd-journald[307]: Journal stopped
Nov 28 18:54:32 np0005539279 systemd-journald: Received SIGTERM from PID 1 (systemd).
Nov 28 18:54:32 np0005539279 kernel: audit: type=1404 audit(1764374072.059:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 28 18:54:32 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 18:54:32 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 18:54:32 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 18:54:32 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 18:54:32 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 18:54:32 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 18:54:32 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 18:54:32 np0005539279 kernel: audit: type=1403 audit(1764374072.194:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 28 18:54:32 np0005539279 systemd: Successfully loaded SELinux policy in 138.798ms.
Nov 28 18:54:32 np0005539279 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.231ms.
Nov 28 18:54:32 np0005539279 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 18:54:32 np0005539279 systemd: Detected virtualization kvm.
Nov 28 18:54:32 np0005539279 systemd: Detected architecture x86-64.
Nov 28 18:54:32 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 18:54:32 np0005539279 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 28 18:54:32 np0005539279 systemd: Stopped Switch Root.
Nov 28 18:54:32 np0005539279 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 28 18:54:32 np0005539279 systemd: Created slice Slice /system/getty.
Nov 28 18:54:32 np0005539279 systemd: Created slice Slice /system/serial-getty.
Nov 28 18:54:32 np0005539279 systemd: Created slice Slice /system/sshd-keygen.
Nov 28 18:54:32 np0005539279 systemd: Created slice User and Session Slice.
Nov 28 18:54:32 np0005539279 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 18:54:32 np0005539279 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 28 18:54:32 np0005539279 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 28 18:54:32 np0005539279 systemd: Reached target Local Encrypted Volumes.
Nov 28 18:54:32 np0005539279 systemd: Stopped target Switch Root.
Nov 28 18:54:32 np0005539279 systemd: Stopped target Initrd File Systems.
Nov 28 18:54:32 np0005539279 systemd: Stopped target Initrd Root File System.
Nov 28 18:54:32 np0005539279 systemd: Reached target Local Integrity Protected Volumes.
Nov 28 18:54:32 np0005539279 systemd: Reached target Path Units.
Nov 28 18:54:32 np0005539279 systemd: Reached target rpc_pipefs.target.
Nov 28 18:54:32 np0005539279 systemd: Reached target Slice Units.
Nov 28 18:54:32 np0005539279 systemd: Reached target Swaps.
Nov 28 18:54:32 np0005539279 systemd: Reached target Local Verity Protected Volumes.
Nov 28 18:54:32 np0005539279 systemd: Listening on RPCbind Server Activation Socket.
Nov 28 18:54:32 np0005539279 systemd: Reached target RPC Port Mapper.
Nov 28 18:54:32 np0005539279 systemd: Listening on Process Core Dump Socket.
Nov 28 18:54:32 np0005539279 systemd: Listening on initctl Compatibility Named Pipe.
Nov 28 18:54:32 np0005539279 systemd: Listening on udev Control Socket.
Nov 28 18:54:32 np0005539279 systemd: Listening on udev Kernel Socket.
Nov 28 18:54:32 np0005539279 systemd: Mounting Huge Pages File System...
Nov 28 18:54:32 np0005539279 systemd: Mounting POSIX Message Queue File System...
Nov 28 18:54:32 np0005539279 systemd: Mounting Kernel Debug File System...
Nov 28 18:54:32 np0005539279 systemd: Mounting Kernel Trace File System...
Nov 28 18:54:32 np0005539279 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 18:54:32 np0005539279 systemd: Starting Create List of Static Device Nodes...
Nov 28 18:54:32 np0005539279 systemd: Starting Load Kernel Module configfs...
Nov 28 18:54:32 np0005539279 systemd: Starting Load Kernel Module drm...
Nov 28 18:54:32 np0005539279 systemd: Starting Load Kernel Module efi_pstore...
Nov 28 18:54:32 np0005539279 systemd: Starting Load Kernel Module fuse...
Nov 28 18:54:32 np0005539279 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 28 18:54:32 np0005539279 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 28 18:54:32 np0005539279 systemd: Stopped File System Check on Root Device.
Nov 28 18:54:32 np0005539279 systemd: Stopped Journal Service.
Nov 28 18:54:32 np0005539279 kernel: fuse: init (API version 7.37)
Nov 28 18:54:32 np0005539279 systemd: Starting Journal Service...
Nov 28 18:54:32 np0005539279 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 28 18:54:32 np0005539279 systemd: Starting Generate network units from Kernel command line...
Nov 28 18:54:32 np0005539279 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 18:54:32 np0005539279 systemd: Starting Remount Root and Kernel File Systems...
Nov 28 18:54:32 np0005539279 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 28 18:54:32 np0005539279 systemd: Starting Apply Kernel Variables...
Nov 28 18:54:32 np0005539279 systemd: Starting Coldplug All udev Devices...
Nov 28 18:54:32 np0005539279 systemd: Mounted Huge Pages File System.
Nov 28 18:54:32 np0005539279 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 28 18:54:32 np0005539279 systemd-journald[680]: Journal started
Nov 28 18:54:32 np0005539279 systemd-journald[680]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 28 18:54:32 np0005539279 systemd[1]: Queued start job for default target Multi-User System.
Nov 28 18:54:32 np0005539279 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 28 18:54:32 np0005539279 systemd: Started Journal Service.
Nov 28 18:54:32 np0005539279 systemd[1]: Mounted POSIX Message Queue File System.
Nov 28 18:54:32 np0005539279 systemd[1]: Mounted Kernel Debug File System.
Nov 28 18:54:32 np0005539279 systemd[1]: Mounted Kernel Trace File System.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Create List of Static Device Nodes.
Nov 28 18:54:32 np0005539279 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Load Kernel Module configfs.
Nov 28 18:54:32 np0005539279 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 28 18:54:32 np0005539279 kernel: ACPI: bus type drm_connector registered
Nov 28 18:54:32 np0005539279 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Load Kernel Module fuse.
Nov 28 18:54:32 np0005539279 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Load Kernel Module drm.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Generate network units from Kernel command line.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 28 18:54:32 np0005539279 systemd[1]: Mounting FUSE Control File System...
Nov 28 18:54:32 np0005539279 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 18:54:32 np0005539279 systemd[1]: Starting Rebuild Hardware Database...
Nov 28 18:54:32 np0005539279 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 28 18:54:32 np0005539279 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 28 18:54:32 np0005539279 systemd[1]: Starting Load/Save OS Random Seed...
Nov 28 18:54:32 np0005539279 systemd[1]: Starting Create System Users...
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Apply Kernel Variables.
Nov 28 18:54:32 np0005539279 systemd[1]: Mounted FUSE Control File System.
Nov 28 18:54:32 np0005539279 systemd-journald[680]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 28 18:54:32 np0005539279 systemd-journald[680]: Received client request to flush runtime journal.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 28 18:54:32 np0005539279 systemd[1]: Finished Coldplug All udev Devices.
Nov 28 18:54:33 np0005539279 systemd[1]: Finished Create System Users.
Nov 28 18:54:33 np0005539279 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 18:54:33 np0005539279 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 18:54:33 np0005539279 systemd[1]: Reached target Preparation for Local File Systems.
Nov 28 18:54:33 np0005539279 systemd[1]: Reached target Local File Systems.
Nov 28 18:54:33 np0005539279 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 28 18:54:33 np0005539279 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 28 18:54:33 np0005539279 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 28 18:54:33 np0005539279 systemd[1]: Starting Automatic Boot Loader Update...
Nov 28 18:54:33 np0005539279 systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 18:54:33 np0005539279 systemd[1]: Finished Load/Save OS Random Seed.
Nov 28 18:54:33 np0005539279 bootctl[696]: Couldn't find EFI system partition, skipping.
Nov 28 18:54:33 np0005539279 systemd[1]: Finished Automatic Boot Loader Update.
Nov 28 18:54:33 np0005539279 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 18:54:33 np0005539279 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 28 18:54:33 np0005539279 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 28 18:54:33 np0005539279 systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 18:54:33 np0005539279 systemd[1]: Starting Security Auditing Service...
Nov 28 18:54:33 np0005539279 systemd[1]: Starting RPC Bind...
Nov 28 18:54:33 np0005539279 systemd[1]: Starting Rebuild Journal Catalog...
Nov 28 18:54:33 np0005539279 auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 28 18:54:33 np0005539279 auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 28 18:54:33 np0005539279 systemd[1]: Finished Rebuild Journal Catalog.
Nov 28 18:54:33 np0005539279 systemd[1]: Started RPC Bind.
Nov 28 18:54:33 np0005539279 augenrules[707]: /sbin/augenrules: No change
Nov 28 18:54:33 np0005539279 augenrules[723]: No rules
Nov 28 18:54:33 np0005539279 augenrules[723]: enabled 1
Nov 28 18:54:33 np0005539279 augenrules[723]: failure 1
Nov 28 18:54:33 np0005539279 augenrules[723]: pid 702
Nov 28 18:54:33 np0005539279 augenrules[723]: rate_limit 0
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog_limit 8192
Nov 28 18:54:33 np0005539279 augenrules[723]: lost 0
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog 4
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog_wait_time 60000
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog_wait_time_actual 0
Nov 28 18:54:33 np0005539279 augenrules[723]: enabled 1
Nov 28 18:54:33 np0005539279 augenrules[723]: failure 1
Nov 28 18:54:33 np0005539279 augenrules[723]: pid 702
Nov 28 18:54:33 np0005539279 augenrules[723]: rate_limit 0
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog_limit 8192
Nov 28 18:54:33 np0005539279 augenrules[723]: lost 0
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog 0
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog_wait_time 60000
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog_wait_time_actual 0
Nov 28 18:54:33 np0005539279 augenrules[723]: enabled 1
Nov 28 18:54:33 np0005539279 augenrules[723]: failure 1
Nov 28 18:54:33 np0005539279 augenrules[723]: pid 702
Nov 28 18:54:33 np0005539279 augenrules[723]: rate_limit 0
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog_limit 8192
Nov 28 18:54:33 np0005539279 augenrules[723]: lost 0
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog 0
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog_wait_time 60000
Nov 28 18:54:33 np0005539279 augenrules[723]: backlog_wait_time_actual 0
Nov 28 18:54:33 np0005539279 systemd[1]: Started Security Auditing Service.
Nov 28 18:54:33 np0005539279 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 28 18:54:33 np0005539279 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 28 18:54:34 np0005539279 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 28 18:54:34 np0005539279 systemd[1]: Finished Rebuild Hardware Database.
Nov 28 18:54:34 np0005539279 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 18:54:34 np0005539279 systemd[1]: Starting Update is Completed...
Nov 28 18:54:34 np0005539279 systemd[1]: Finished Update is Completed.
Nov 28 18:54:34 np0005539279 systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 18:54:34 np0005539279 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 18:54:34 np0005539279 systemd[1]: Reached target System Initialization.
Nov 28 18:54:34 np0005539279 systemd[1]: Started dnf makecache --timer.
Nov 28 18:54:34 np0005539279 systemd[1]: Started Daily rotation of log files.
Nov 28 18:54:34 np0005539279 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 28 18:54:34 np0005539279 systemd[1]: Reached target Timer Units.
Nov 28 18:54:34 np0005539279 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 28 18:54:34 np0005539279 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 28 18:54:34 np0005539279 systemd[1]: Reached target Socket Units.
Nov 28 18:54:34 np0005539279 systemd[1]: Starting D-Bus System Message Bus...
Nov 28 18:54:34 np0005539279 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 18:54:34 np0005539279 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 28 18:54:34 np0005539279 systemd[1]: Starting Load Kernel Module configfs...
Nov 28 18:54:34 np0005539279 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 18:54:34 np0005539279 systemd[1]: Finished Load Kernel Module configfs.
Nov 28 18:54:34 np0005539279 systemd-udevd[741]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 18:54:35 np0005539279 systemd[1]: Started D-Bus System Message Bus.
Nov 28 18:54:35 np0005539279 systemd[1]: Reached target Basic System.
Nov 28 18:54:35 np0005539279 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 28 18:54:35 np0005539279 dbus-broker-lau[767]: Ready
Nov 28 18:54:35 np0005539279 systemd[1]: Starting NTP client/server...
Nov 28 18:54:35 np0005539279 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 28 18:54:35 np0005539279 chronyd[781]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 18:54:35 np0005539279 chronyd[781]: Loaded 0 symmetric keys
Nov 28 18:54:35 np0005539279 chronyd[781]: Using right/UTC timezone to obtain leap second data
Nov 28 18:54:35 np0005539279 chronyd[781]: Loaded seccomp filter (level 2)
Nov 28 18:54:35 np0005539279 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 28 18:54:35 np0005539279 systemd[1]: Starting IPv4 firewall with iptables...
Nov 28 18:54:35 np0005539279 systemd[1]: Started irqbalance daemon.
Nov 28 18:54:35 np0005539279 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 28 18:54:35 np0005539279 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 18:54:35 np0005539279 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 18:54:35 np0005539279 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 18:54:35 np0005539279 systemd[1]: Reached target sshd-keygen.target.
Nov 28 18:54:35 np0005539279 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 28 18:54:35 np0005539279 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 28 18:54:35 np0005539279 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 28 18:54:35 np0005539279 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 28 18:54:35 np0005539279 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 28 18:54:35 np0005539279 kernel: kvm_amd: TSC scaling supported
Nov 28 18:54:35 np0005539279 kernel: kvm_amd: Nested Virtualization enabled
Nov 28 18:54:35 np0005539279 kernel: kvm_amd: Nested Paging enabled
Nov 28 18:54:35 np0005539279 kernel: kvm_amd: LBR virtualization supported
Nov 28 18:54:35 np0005539279 kernel: Console: switching to colour dummy device 80x25
Nov 28 18:54:35 np0005539279 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 28 18:54:35 np0005539279 systemd[1]: Reached target User and Group Name Lookups.
Nov 28 18:54:35 np0005539279 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 28 18:54:35 np0005539279 kernel: [drm] features: -context_init
Nov 28 18:54:35 np0005539279 kernel: [drm] number of scanouts: 1
Nov 28 18:54:35 np0005539279 kernel: [drm] number of cap sets: 0
Nov 28 18:54:35 np0005539279 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 28 18:54:35 np0005539279 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 28 18:54:35 np0005539279 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 28 18:54:35 np0005539279 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 28 18:54:35 np0005539279 kernel: Console: switching to colour frame buffer device 128x48
Nov 28 18:54:35 np0005539279 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 28 18:54:35 np0005539279 systemd[1]: Starting User Login Management...
Nov 28 18:54:35 np0005539279 systemd[1]: Started NTP client/server.
Nov 28 18:54:35 np0005539279 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 28 18:54:35 np0005539279 systemd-logind[811]: New seat seat0.
Nov 28 18:54:35 np0005539279 systemd-logind[811]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 18:54:35 np0005539279 systemd-logind[811]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 18:54:35 np0005539279 systemd[1]: Started User Login Management.
Nov 28 18:54:35 np0005539279 iptables.init[790]: iptables: Applying firewall rules: [  OK  ]
Nov 28 18:54:35 np0005539279 systemd[1]: Finished IPv4 firewall with iptables.
Nov 28 18:54:35 np0005539279 cloud-init[839]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 28 Nov 2025 23:54:35 +0000. Up 9.32 seconds.
Nov 28 18:54:35 np0005539279 systemd[1]: run-cloud\x2dinit-tmp-tmpwxndediw.mount: Deactivated successfully.
Nov 28 18:54:35 np0005539279 systemd[1]: Starting Hostname Service...
Nov 28 18:54:35 np0005539279 systemd[1]: Started Hostname Service.
Nov 28 18:54:35 np0005539279 systemd-hostnamed[854]: Hostname set to <np0005539279.novalocal> (static)
Nov 28 18:54:36 np0005539279 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 28 18:54:36 np0005539279 systemd[1]: Reached target Preparation for Network.
Nov 28 18:54:36 np0005539279 systemd[1]: Starting Network Manager...
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.1701] NetworkManager (version 1.54.1-1.el9) is starting... (boot:5b517f8a-0dc6-4308-b13f-84fe445a9842)
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.1708] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.1797] manager[0x55e2e8710080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.1851] hostname: hostname: using hostnamed
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.1851] hostname: static hostname changed from (none) to "np0005539279.novalocal"
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.1857] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2024] manager[0x55e2e8710080]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2026] manager[0x55e2e8710080]: rfkill: WWAN hardware radio set enabled
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2073] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2073] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2074] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2075] manager: Networking is enabled by state file
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2076] settings: Loaded settings plugin: keyfile (internal)
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2095] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 18:54:36 np0005539279 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2130] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2154] dhcp: init: Using DHCP client 'internal'
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2157] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2180] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2191] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2203] device (lo): Activation: starting connection 'lo' (abb2dd2c-7d4b-48ba-b333-c50c8b96e666)
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2216] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2222] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2280] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2286] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2289] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2291] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2294] device (eth0): carrier: link connected
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2296] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 28 18:54:36 np0005539279 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2304] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2312] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2317] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2318] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2320] manager: NetworkManager state is now CONNECTING
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2322] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2328] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2332] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 18:54:36 np0005539279 systemd[1]: Started Network Manager.
Nov 28 18:54:36 np0005539279 systemd[1]: Reached target Network.
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2385] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2394] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 18:54:36 np0005539279 systemd[1]: Starting Network Manager Wait Online...
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2420] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 18:54:36 np0005539279 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 28 18:54:36 np0005539279 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2588] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2592] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2599] device (lo): Activation: successful, device activated.
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2620] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2623] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2627] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2631] device (eth0): Activation: successful, device activated.
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2638] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 18:54:36 np0005539279 NetworkManager[858]: <info>  [1764374076.2641] manager: startup complete
Nov 28 18:54:36 np0005539279 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 28 18:54:36 np0005539279 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 18:54:36 np0005539279 systemd[1]: Reached target NFS client services.
Nov 28 18:54:36 np0005539279 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 18:54:36 np0005539279 systemd[1]: Reached target Remote File Systems.
Nov 28 18:54:36 np0005539279 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 18:54:36 np0005539279 systemd[1]: Finished Network Manager Wait Online.
Nov 28 18:54:36 np0005539279 systemd[1]: Starting Cloud-init: Network Stage...
Nov 28 18:54:36 np0005539279 cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 28 Nov 2025 23:54:36 +0000. Up 10.33 seconds.
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.110         | 255.255.255.0 | global | fa:16:3e:74:b9:7a |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe74:b97a/64 |       .       |  link  | fa:16:3e:74:b9:7a |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 28 18:54:36 np0005539279 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 18:54:37 np0005539279 cloud-init[922]: Generating public/private rsa key pair.
Nov 28 18:54:37 np0005539279 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 28 18:54:37 np0005539279 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 28 18:54:37 np0005539279 cloud-init[922]: The key fingerprint is:
Nov 28 18:54:37 np0005539279 cloud-init[922]: SHA256:WuXpnvBj98HBL8aNYqAvK3qm7/mTdlGMt6jGeDYbATE root@np0005539279.novalocal
Nov 28 18:54:37 np0005539279 cloud-init[922]: The key's randomart image is:
Nov 28 18:54:37 np0005539279 cloud-init[922]: +---[RSA 3072]----+
Nov 28 18:54:37 np0005539279 cloud-init[922]: |      E          |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |       o         |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |      .   .o     |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |       . o..+.   |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |        S ++ .o  |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |       o +o..o = |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |      .o+o..o B o|
Nov 28 18:54:37 np0005539279 cloud-init[922]: |      =.%*+o.o o |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |    oB+B+O=o ..  |
Nov 28 18:54:37 np0005539279 cloud-init[922]: +----[SHA256]-----+
Nov 28 18:54:37 np0005539279 cloud-init[922]: Generating public/private ecdsa key pair.
Nov 28 18:54:37 np0005539279 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 28 18:54:37 np0005539279 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 28 18:54:37 np0005539279 cloud-init[922]: The key fingerprint is:
Nov 28 18:54:37 np0005539279 cloud-init[922]: SHA256:mUKWUjKzdI1iFsnaEMiod9b1hM7acCPU3RZZzop+m+E root@np0005539279.novalocal
Nov 28 18:54:37 np0005539279 cloud-init[922]: The key's randomart image is:
Nov 28 18:54:37 np0005539279 cloud-init[922]: +---[ECDSA 256]---+
Nov 28 18:54:37 np0005539279 cloud-init[922]: |o..o*oo+ o ..+.  |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |o...*Bo.= o +o   |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |.  *o+++ o .  o  |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |. o +++ =o.. .   |
Nov 28 18:54:37 np0005539279 cloud-init[922]: | . o  .*S.. .    |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |      ....       |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |          . o    |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |           o +   |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |            E    |
Nov 28 18:54:37 np0005539279 cloud-init[922]: +----[SHA256]-----+
Nov 28 18:54:37 np0005539279 cloud-init[922]: Generating public/private ed25519 key pair.
Nov 28 18:54:37 np0005539279 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 28 18:54:37 np0005539279 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 28 18:54:37 np0005539279 cloud-init[922]: The key fingerprint is:
Nov 28 18:54:37 np0005539279 cloud-init[922]: SHA256:Z5aZrxe1J+YzD1cD2uBOSNeqAaXrXMapKAOvCfamEzk root@np0005539279.novalocal
Nov 28 18:54:37 np0005539279 cloud-init[922]: The key's randomart image is:
Nov 28 18:54:37 np0005539279 cloud-init[922]: +--[ED25519 256]--+
Nov 28 18:54:37 np0005539279 cloud-init[922]: |         .       |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |        o   .    |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |       o . o o   |
Nov 28 18:54:37 np0005539279 cloud-init[922]: |        = =+= o  |
Nov 28 18:54:37 np0005539279 cloud-init[922]: | ..    .SBB= o o.|
Nov 28 18:54:37 np0005539279 cloud-init[922]: | Eo   + ++=.. + +|
Nov 28 18:54:37 np0005539279 cloud-init[922]: |..o+ . + . ..+.o.|
Nov 28 18:54:37 np0005539279 cloud-init[922]: |o.+.o      .. +o |
Nov 28 18:54:37 np0005539279 cloud-init[922]: | ++.      ..   +.|
Nov 28 18:54:37 np0005539279 cloud-init[922]: +----[SHA256]-----+
Nov 28 18:54:38 np0005539279 sm-notify[1005]: Version 2.5.4 starting
Nov 28 18:54:38 np0005539279 systemd[1]: Finished Cloud-init: Network Stage.
Nov 28 18:54:38 np0005539279 systemd[1]: Reached target Cloud-config availability.
Nov 28 18:54:38 np0005539279 systemd[1]: Reached target Network is Online.
Nov 28 18:54:38 np0005539279 systemd[1]: Starting Cloud-init: Config Stage...
Nov 28 18:54:38 np0005539279 systemd[1]: Starting Crash recovery kernel arming...
Nov 28 18:54:38 np0005539279 systemd[1]: Starting Notify NFS peers of a restart...
Nov 28 18:54:38 np0005539279 systemd[1]: Starting System Logging Service...
Nov 28 18:54:38 np0005539279 systemd[1]: Starting OpenSSH server daemon...
Nov 28 18:54:38 np0005539279 systemd[1]: Starting Permit User Sessions...
Nov 28 18:54:38 np0005539279 systemd[1]: Started Notify NFS peers of a restart.
Nov 28 18:54:38 np0005539279 systemd[1]: Finished Permit User Sessions.
Nov 28 18:54:38 np0005539279 systemd[1]: Started Command Scheduler.
Nov 28 18:54:38 np0005539279 systemd[1]: Started Getty on tty1.
Nov 28 18:54:38 np0005539279 systemd[1]: Started Serial Getty on ttyS0.
Nov 28 18:54:38 np0005539279 systemd[1]: Reached target Login Prompts.
Nov 28 18:54:38 np0005539279 systemd[1]: Started OpenSSH server daemon.
Nov 28 18:54:38 np0005539279 rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Nov 28 18:54:38 np0005539279 rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 28 18:54:38 np0005539279 systemd[1]: Started System Logging Service.
Nov 28 18:54:38 np0005539279 systemd[1]: Reached target Multi-User System.
Nov 28 18:54:38 np0005539279 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 28 18:54:38 np0005539279 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 28 18:54:38 np0005539279 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 28 18:54:38 np0005539279 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 18:54:38 np0005539279 kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Nov 28 18:54:38 np0005539279 kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 28 18:54:38 np0005539279 cloud-init[1134]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 28 Nov 2025 23:54:38 +0000. Up 12.05 seconds.
Nov 28 18:54:38 np0005539279 systemd[1]: Finished Cloud-init: Config Stage.
Nov 28 18:54:38 np0005539279 systemd[1]: Starting Cloud-init: Final Stage...
Nov 28 18:54:38 np0005539279 dracut[1286]: dracut-057-102.git20250818.el9
Nov 28 18:54:38 np0005539279 cloud-init[1302]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 28 Nov 2025 23:54:38 +0000. Up 12.46 seconds.
Nov 28 18:54:38 np0005539279 cloud-init[1304]: #############################################################
Nov 28 18:54:38 np0005539279 cloud-init[1305]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 28 18:54:38 np0005539279 cloud-init[1307]: 256 SHA256:mUKWUjKzdI1iFsnaEMiod9b1hM7acCPU3RZZzop+m+E root@np0005539279.novalocal (ECDSA)
Nov 28 18:54:38 np0005539279 cloud-init[1309]: 256 SHA256:Z5aZrxe1J+YzD1cD2uBOSNeqAaXrXMapKAOvCfamEzk root@np0005539279.novalocal (ED25519)
Nov 28 18:54:38 np0005539279 cloud-init[1311]: 3072 SHA256:WuXpnvBj98HBL8aNYqAvK3qm7/mTdlGMt6jGeDYbATE root@np0005539279.novalocal (RSA)
Nov 28 18:54:38 np0005539279 cloud-init[1312]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 28 18:54:38 np0005539279 cloud-init[1313]: #############################################################
Nov 28 18:54:38 np0005539279 cloud-init[1302]: Cloud-init v. 24.4-7.el9 finished at Fri, 28 Nov 2025 23:54:38 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 12.65 seconds
Nov 28 18:54:38 np0005539279 dracut[1288]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 28 18:54:38 np0005539279 systemd[1]: Finished Cloud-init: Final Stage.
Nov 28 18:54:38 np0005539279 systemd[1]: Reached target Cloud-init target.
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 18:54:39 np0005539279 dracut[1288]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: memstrack is not available
Nov 28 18:54:40 np0005539279 dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 18:54:40 np0005539279 dracut[1288]: memstrack is not available
Nov 28 18:54:40 np0005539279 dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 18:54:40 np0005539279 dracut[1288]: *** Including module: systemd ***
Nov 28 18:54:40 np0005539279 dracut[1288]: *** Including module: fips ***
Nov 28 18:54:41 np0005539279 chronyd[781]: Selected source 206.108.0.131 (2.centos.pool.ntp.org)
Nov 28 18:54:41 np0005539279 chronyd[781]: System clock TAI offset set to 37 seconds
Nov 28 18:54:41 np0005539279 dracut[1288]: *** Including module: systemd-initrd ***
Nov 28 18:54:41 np0005539279 dracut[1288]: *** Including module: i18n ***
Nov 28 18:54:41 np0005539279 dracut[1288]: *** Including module: drm ***
Nov 28 18:54:42 np0005539279 dracut[1288]: *** Including module: prefixdevname ***
Nov 28 18:54:42 np0005539279 dracut[1288]: *** Including module: kernel-modules ***
Nov 28 18:54:42 np0005539279 kernel: block vda: the capability attribute has been deprecated.
Nov 28 18:54:42 np0005539279 dracut[1288]: *** Including module: kernel-modules-extra ***
Nov 28 18:54:42 np0005539279 dracut[1288]: *** Including module: qemu ***
Nov 28 18:54:42 np0005539279 dracut[1288]: *** Including module: fstab-sys ***
Nov 28 18:54:42 np0005539279 dracut[1288]: *** Including module: rootfs-block ***
Nov 28 18:54:42 np0005539279 dracut[1288]: *** Including module: terminfo ***
Nov 28 18:54:42 np0005539279 dracut[1288]: *** Including module: udev-rules ***
Nov 28 18:54:43 np0005539279 dracut[1288]: Skipping udev rule: 91-permissions.rules
Nov 28 18:54:43 np0005539279 dracut[1288]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 28 18:54:43 np0005539279 dracut[1288]: *** Including module: virtiofs ***
Nov 28 18:54:43 np0005539279 dracut[1288]: *** Including module: dracut-systemd ***
Nov 28 18:54:44 np0005539279 dracut[1288]: *** Including module: usrmount ***
Nov 28 18:54:44 np0005539279 dracut[1288]: *** Including module: base ***
Nov 28 18:54:44 np0005539279 dracut[1288]: *** Including module: fs-lib ***
Nov 28 18:54:44 np0005539279 dracut[1288]: *** Including module: kdumpbase ***
Nov 28 18:54:44 np0005539279 dracut[1288]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 28 18:54:44 np0005539279 dracut[1288]:  microcode_ctl module: mangling fw_dir
Nov 28 18:54:44 np0005539279 dracut[1288]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 28 18:54:44 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 28 18:54:44 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel" is ignored
Nov 28 18:54:44 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 28 18:54:44 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 28 18:54:44 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 28 18:54:44 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 28 18:54:44 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 28 18:54:45 np0005539279 irqbalance[796]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 28 18:54:45 np0005539279 irqbalance[796]: IRQ 25 affinity is now unmanaged
Nov 28 18:54:45 np0005539279 irqbalance[796]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 28 18:54:45 np0005539279 irqbalance[796]: IRQ 31 affinity is now unmanaged
Nov 28 18:54:45 np0005539279 irqbalance[796]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 28 18:54:45 np0005539279 irqbalance[796]: IRQ 28 affinity is now unmanaged
Nov 28 18:54:45 np0005539279 irqbalance[796]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 28 18:54:45 np0005539279 irqbalance[796]: IRQ 32 affinity is now unmanaged
Nov 28 18:54:45 np0005539279 irqbalance[796]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 28 18:54:45 np0005539279 irqbalance[796]: IRQ 30 affinity is now unmanaged
Nov 28 18:54:45 np0005539279 irqbalance[796]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 28 18:54:45 np0005539279 irqbalance[796]: IRQ 29 affinity is now unmanaged
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 28 18:54:45 np0005539279 dracut[1288]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 28 18:54:45 np0005539279 dracut[1288]: *** Including module: openssl ***
Nov 28 18:54:45 np0005539279 dracut[1288]: *** Including module: shutdown ***
Nov 28 18:54:45 np0005539279 dracut[1288]: *** Including module: squash ***
Nov 28 18:54:45 np0005539279 dracut[1288]: *** Including modules done ***
Nov 28 18:54:45 np0005539279 dracut[1288]: *** Installing kernel module dependencies ***
Nov 28 18:54:46 np0005539279 dracut[1288]: *** Installing kernel module dependencies done ***
Nov 28 18:54:46 np0005539279 dracut[1288]: *** Resolving executable dependencies ***
Nov 28 18:54:46 np0005539279 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 18:54:48 np0005539279 dracut[1288]: *** Resolving executable dependencies done ***
Nov 28 18:54:48 np0005539279 dracut[1288]: *** Generating early-microcode cpio image ***
Nov 28 18:54:48 np0005539279 dracut[1288]: *** Store current command line parameters ***
Nov 28 18:54:48 np0005539279 dracut[1288]: Stored kernel commandline:
Nov 28 18:54:48 np0005539279 dracut[1288]: No dracut internal kernel commandline stored in the initramfs
Nov 28 18:54:48 np0005539279 dracut[1288]: *** Install squash loader ***
Nov 28 18:54:50 np0005539279 dracut[1288]: *** Squashing the files inside the initramfs ***
Nov 28 18:54:51 np0005539279 dracut[1288]: *** Squashing the files inside the initramfs done ***
Nov 28 18:54:51 np0005539279 dracut[1288]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 28 18:54:51 np0005539279 dracut[1288]: *** Hardlinking files ***
Nov 28 18:54:51 np0005539279 dracut[1288]: *** Hardlinking files done ***
Nov 28 18:54:51 np0005539279 dracut[1288]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 28 18:54:52 np0005539279 kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Nov 28 18:54:52 np0005539279 kdumpctl[1018]: kdump: Starting kdump: [OK]
Nov 28 18:54:52 np0005539279 systemd[1]: Finished Crash recovery kernel arming.
Nov 28 18:54:52 np0005539279 systemd[1]: Startup finished in 1.839s (kernel) + 3.975s (initrd) + 20.886s (userspace) = 26.702s.
Nov 28 18:55:06 np0005539279 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 18:55:57 np0005539279 systemd[1]: Created slice User Slice of UID 1000.
Nov 28 18:55:58 np0005539279 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 28 18:55:58 np0005539279 systemd-logind[811]: New session 1 of user zuul.
Nov 28 18:55:58 np0005539279 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 28 18:55:58 np0005539279 systemd[1]: Starting User Manager for UID 1000...
Nov 28 18:55:58 np0005539279 systemd[4302]: Queued start job for default target Main User Target.
Nov 28 18:55:58 np0005539279 systemd[4302]: Created slice User Application Slice.
Nov 28 18:55:58 np0005539279 systemd[4302]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 18:55:58 np0005539279 systemd[4302]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 18:55:58 np0005539279 systemd[4302]: Reached target Paths.
Nov 28 18:55:58 np0005539279 systemd[4302]: Reached target Timers.
Nov 28 18:55:58 np0005539279 systemd[4302]: Starting D-Bus User Message Bus Socket...
Nov 28 18:55:58 np0005539279 systemd[4302]: Starting Create User's Volatile Files and Directories...
Nov 28 18:55:58 np0005539279 systemd[4302]: Finished Create User's Volatile Files and Directories.
Nov 28 18:55:58 np0005539279 systemd[4302]: Listening on D-Bus User Message Bus Socket.
Nov 28 18:55:58 np0005539279 systemd[4302]: Reached target Sockets.
Nov 28 18:55:58 np0005539279 systemd[4302]: Reached target Basic System.
Nov 28 18:55:58 np0005539279 systemd[4302]: Reached target Main User Target.
Nov 28 18:55:58 np0005539279 systemd[4302]: Startup finished in 159ms.
Nov 28 18:55:58 np0005539279 systemd[1]: Started User Manager for UID 1000.
Nov 28 18:55:58 np0005539279 systemd[1]: Started Session 1 of User zuul.
Nov 28 18:55:58 np0005539279 python3[4384]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 18:56:01 np0005539279 python3[4412]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 18:56:07 np0005539279 python3[4470]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 18:56:08 np0005539279 python3[4510]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 28 18:56:09 np0005539279 python3[4536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD160DmVuc1zPWyZS+AULjgMAMcTj/LIRwiud7eMjgtmzZvdJ+BAlHf4LsUzgMGrcZlY8oefEVFtHghHmSmHkeM9rO5FENnDzf0AW0/4lme+Vv4IW+8ihbGczdN9ir9xSxF61toxWz7zHRvd3mbYaSxFfI1IncBdH/4Q8Lh9t0t0SsiUiP3oj3XGBhVUVuOcgYfR+N2eCRhD5uf9x+G3Ov5VhNGsb17Txo9RK3KjKrZC9E5hVJ/lagK7/Z1TXE7MWS7K9uHzvxj3fvo5k0h086SDC8sr/GO9w35cFvTsWSrvzZIquODFEei7iEl3fnM54xAw1g/vLT9TAj8Z1tBexKsLrsWbZ6j88TKPEHiOWMSiirrlYNdZ8tphdOAE9tP6PeFblA8d5vUU20Wwa9L1VrN/lNSbC6D6ENU4ijuryS6e5ssvjbA34h4L5E7N0JtF/aI8rpPJR2fTUvRV0Sod+L2cAICEYLSlrAJhKUzsSJxhGNx/+AdaKIHXD0R6u/Np6k= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:10 np0005539279 python3[4560]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:10 np0005539279 python3[4659]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 18:56:11 np0005539279 python3[4730]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764374170.5752451-207-73345551635300/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=aab34d4785b74545ba6dcdd725ab7da7_id_rsa follow=False checksum=2c14c4786a5c6afd9ab2e2c3432c64f45604b9ff backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:12 np0005539279 python3[4853]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 18:56:12 np0005539279 python3[4924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764374171.552547-240-260357605909315/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=aab34d4785b74545ba6dcdd725ab7da7_id_rsa.pub follow=False checksum=3af6c65fa800201d619d7106ea5da06964eb4737 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:13 np0005539279 python3[4972]: ansible-ping Invoked with data=pong
Nov 28 18:56:14 np0005539279 python3[4996]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 18:56:16 np0005539279 python3[5054]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 28 18:56:17 np0005539279 python3[5086]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:17 np0005539279 python3[5110]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:17 np0005539279 python3[5134]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:18 np0005539279 python3[5158]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:18 np0005539279 python3[5182]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:18 np0005539279 python3[5206]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:20 np0005539279 python3[5232]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:20 np0005539279 python3[5310]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 18:56:21 np0005539279 python3[5383]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764374180.2799475-21-149918598487503/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:21 np0005539279 python3[5431]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:22 np0005539279 python3[5455]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:22 np0005539279 python3[5479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:22 np0005539279 python3[5503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:22 np0005539279 python3[5527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:23 np0005539279 python3[5551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:23 np0005539279 python3[5575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:23 np0005539279 python3[5599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:24 np0005539279 python3[5623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:24 np0005539279 python3[5647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:24 np0005539279 python3[5671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:25 np0005539279 python3[5695]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:25 np0005539279 python3[5719]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:25 np0005539279 python3[5743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:25 np0005539279 python3[5767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:26 np0005539279 python3[5791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:26 np0005539279 python3[5815]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:26 np0005539279 python3[5839]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:27 np0005539279 python3[5863]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:27 np0005539279 python3[5887]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:27 np0005539279 python3[5911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:28 np0005539279 python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:28 np0005539279 python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:28 np0005539279 python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:28 np0005539279 python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:29 np0005539279 python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 18:56:31 np0005539279 python3[6057]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 18:56:31 np0005539279 systemd[1]: Starting Time & Date Service...
Nov 28 18:56:31 np0005539279 systemd[1]: Started Time & Date Service.
Nov 28 18:56:31 np0005539279 systemd-timedated[6059]: Changed time zone to 'UTC' (UTC).
Nov 28 18:56:32 np0005539279 python3[6088]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:33 np0005539279 python3[6164]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 18:56:33 np0005539279 python3[6235]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764374193.1941297-153-196385976898852/source _original_basename=tmpezrovosh follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:34 np0005539279 python3[6335]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 18:56:34 np0005539279 python3[6406]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764374194.0928202-183-226096574704149/source _original_basename=tmp0o4ocd09 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:35 np0005539279 python3[6508]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 18:56:35 np0005539279 python3[6581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764374195.1108987-231-239416987451127/source _original_basename=tmp_3lu_ujk follow=False checksum=54ceff67f46a00e80734f8bde7b737fc4d565204 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:36 np0005539279 python3[6629]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 18:56:36 np0005539279 python3[6655]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 18:56:37 np0005539279 python3[6735]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 18:56:37 np0005539279 python3[6808]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764374196.8195362-273-76813323377382/source _original_basename=tmp2etfznzh follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:38 np0005539279 python3[6859]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-525f-e06e-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 18:56:39 np0005539279 python3[6887]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-525f-e06e-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 28 18:56:40 np0005539279 python3[6915]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:56:58 np0005539279 python3[6941]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:57:01 np0005539279 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 18:57:35 np0005539279 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 28 18:57:35 np0005539279 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 28 18:57:35 np0005539279 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 28 18:57:35 np0005539279 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 28 18:57:35 np0005539279 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 28 18:57:35 np0005539279 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 28 18:57:35 np0005539279 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 28 18:57:35 np0005539279 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 28 18:57:35 np0005539279 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 28 18:57:35 np0005539279 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.1797] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 18:57:35 np0005539279 systemd-udevd[6945]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2069] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2099] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2102] device (eth1): carrier: link connected
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2105] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2111] policy: auto-activating connection 'Wired connection 1' (17a1339d-392e-3454-a32b-9b13aa44301c)
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2115] device (eth1): Activation: starting connection 'Wired connection 1' (17a1339d-392e-3454-a32b-9b13aa44301c)
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2117] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2119] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2124] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 18:57:35 np0005539279 NetworkManager[858]: <info>  [1764374255.2129] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 18:57:36 np0005539279 python3[6971]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-7cd2-cec2-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 18:57:43 np0005539279 python3[7051]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 18:57:43 np0005539279 python3[7124]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764374262.8017488-102-246092747508040/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=525478d1dd858f4be7c8e1853b6527738d52f794 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:57:44 np0005539279 python3[7174]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 18:57:44 np0005539279 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 28 18:57:44 np0005539279 systemd[1]: Stopped Network Manager Wait Online.
Nov 28 18:57:44 np0005539279 systemd[1]: Stopping Network Manager Wait Online...
Nov 28 18:57:44 np0005539279 NetworkManager[858]: <info>  [1764374264.4372] caught SIGTERM, shutting down normally.
Nov 28 18:57:44 np0005539279 systemd[1]: Stopping Network Manager...
Nov 28 18:57:44 np0005539279 NetworkManager[858]: <info>  [1764374264.4385] dhcp4 (eth0): canceled DHCP transaction
Nov 28 18:57:44 np0005539279 NetworkManager[858]: <info>  [1764374264.4385] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 18:57:44 np0005539279 NetworkManager[858]: <info>  [1764374264.4385] dhcp4 (eth0): state changed no lease
Nov 28 18:57:44 np0005539279 NetworkManager[858]: <info>  [1764374264.4389] manager: NetworkManager state is now CONNECTING
Nov 28 18:57:44 np0005539279 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 18:57:44 np0005539279 NetworkManager[858]: <info>  [1764374264.4655] dhcp4 (eth1): canceled DHCP transaction
Nov 28 18:57:44 np0005539279 NetworkManager[858]: <info>  [1764374264.4656] dhcp4 (eth1): state changed no lease
Nov 28 18:57:44 np0005539279 NetworkManager[858]: <info>  [1764374264.4731] exiting (success)
Nov 28 18:57:44 np0005539279 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 18:57:44 np0005539279 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 28 18:57:44 np0005539279 systemd[1]: Stopped Network Manager.
Nov 28 18:57:44 np0005539279 systemd[1]: NetworkManager.service: Consumed 1.307s CPU time, 10.2M memory peak.
Nov 28 18:57:44 np0005539279 systemd[1]: Starting Network Manager...
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.5530] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:5b517f8a-0dc6-4308-b13f-84fe445a9842)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.5535] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.5605] manager[0x556924b53070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 18:57:44 np0005539279 systemd[1]: Starting Hostname Service...
Nov 28 18:57:44 np0005539279 systemd[1]: Started Hostname Service.
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6828] hostname: hostname: using hostnamed
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6829] hostname: static hostname changed from (none) to "np0005539279.novalocal"
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6839] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6848] manager[0x556924b53070]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6848] manager[0x556924b53070]: rfkill: WWAN hardware radio set enabled
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6904] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6905] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6906] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6907] manager: Networking is enabled by state file
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6911] settings: Loaded settings plugin: keyfile (internal)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6918] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6966] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6986] dhcp: init: Using DHCP client 'internal'
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.6991] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7000] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7010] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7029] device (lo): Activation: starting connection 'lo' (abb2dd2c-7d4b-48ba-b333-c50c8b96e666)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7047] device (eth0): carrier: link connected
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7057] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7067] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7069] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7081] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7094] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7106] device (eth1): carrier: link connected
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7115] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7125] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (17a1339d-392e-3454-a32b-9b13aa44301c) (indicated)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7127] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7138] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7151] device (eth1): Activation: starting connection 'Wired connection 1' (17a1339d-392e-3454-a32b-9b13aa44301c)
Nov 28 18:57:44 np0005539279 systemd[1]: Started Network Manager.
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7162] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7169] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7173] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7177] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7181] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7187] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7191] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7196] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7202] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7214] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7220] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7237] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7243] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7269] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7279] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7290] device (lo): Activation: successful, device activated.
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7303] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7318] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 18:57:44 np0005539279 systemd[1]: Starting Network Manager Wait Online...
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7427] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7456] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7460] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7470] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7477] device (eth0): Activation: successful, device activated.
Nov 28 18:57:44 np0005539279 NetworkManager[7186]: <info>  [1764374264.7489] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 18:57:45 np0005539279 python3[7258]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-7cd2-cec2-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 18:57:54 np0005539279 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 18:58:14 np0005539279 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.2427] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 18:58:30 np0005539279 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 18:58:30 np0005539279 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.2831] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.2835] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.2853] device (eth1): Activation: successful, device activated.
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.2867] manager: startup complete
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.2870] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <warn>  [1764374310.2896] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.2911] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 28 18:58:30 np0005539279 systemd[1]: Finished Network Manager Wait Online.
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3095] dhcp4 (eth1): canceled DHCP transaction
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3096] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3096] dhcp4 (eth1): state changed no lease
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3122] policy: auto-activating connection 'ci-private-network' (f43141e8-2e9e-5be9-96d8-284c209fde47)
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3129] device (eth1): Activation: starting connection 'ci-private-network' (f43141e8-2e9e-5be9-96d8-284c209fde47)
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3131] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3135] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3146] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3160] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3221] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3224] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 18:58:30 np0005539279 NetworkManager[7186]: <info>  [1764374310.3233] device (eth1): Activation: successful, device activated.
Nov 28 18:58:40 np0005539279 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 18:58:41 np0005539279 systemd[4302]: Starting Mark boot as successful...
Nov 28 18:58:41 np0005539279 systemd[4302]: Finished Mark boot as successful.
Nov 28 18:58:45 np0005539279 systemd-logind[811]: Session 1 logged out. Waiting for processes to exit.
Nov 28 18:58:46 np0005539279 systemd-logind[811]: New session 3 of user zuul.
Nov 28 18:58:46 np0005539279 systemd[1]: Started Session 3 of User zuul.
Nov 28 18:58:47 np0005539279 python3[7368]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 18:58:47 np0005539279 python3[7441]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764374326.9165618-259-94673917832910/source _original_basename=tmpf_pei_1t follow=False checksum=78a4458192b635daff57e73d2bdc9db266d6510b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 18:58:49 np0005539279 systemd[1]: session-3.scope: Deactivated successfully.
Nov 28 18:58:49 np0005539279 systemd-logind[811]: Session 3 logged out. Waiting for processes to exit.
Nov 28 18:58:49 np0005539279 systemd-logind[811]: Removed session 3.
Nov 28 19:00:41 np0005539279 systemd[1]: Starting Rotate log files...
Nov 28 19:00:41 np0005539279 systemd[1]: logrotate.service: Deactivated successfully.
Nov 28 19:00:41 np0005539279 systemd[1]: Finished Rotate log files.
Nov 28 19:01:41 np0005539279 systemd[4302]: Created slice User Background Tasks Slice.
Nov 28 19:01:41 np0005539279 systemd[4302]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 19:01:41 np0005539279 systemd[4302]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 19:04:56 np0005539279 systemd-logind[811]: New session 4 of user zuul.
Nov 28 19:04:56 np0005539279 systemd[1]: Started Session 4 of User zuul.
Nov 28 19:04:57 np0005539279 python3[7523]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-5889-56aa-000000001cce-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:04:57 np0005539279 python3[7552]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:04:58 np0005539279 python3[7578]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:04:58 np0005539279 python3[7604]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:04:58 np0005539279 python3[7630]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:04:59 np0005539279 python3[7656]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:04:59 np0005539279 python3[7734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 19:05:00 np0005539279 python3[7807]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764374699.4378643-474-228401647997824/source _original_basename=tmpb6mu6o21 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:05:01 np0005539279 python3[7857]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:05:01 np0005539279 systemd[1]: Reloading.
Nov 28 19:05:01 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:05:02 np0005539279 python3[7913]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 28 19:05:03 np0005539279 python3[7939]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:05:03 np0005539279 python3[7967]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:05:03 np0005539279 python3[7995]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:05:04 np0005539279 python3[8023]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:05:04 np0005539279 python3[8050]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-5889-56aa-000000001cd5-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:05:05 np0005539279 python3[8080]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 19:05:07 np0005539279 systemd[1]: session-4.scope: Deactivated successfully.
Nov 28 19:05:07 np0005539279 systemd[1]: session-4.scope: Consumed 4.535s CPU time.
Nov 28 19:05:07 np0005539279 systemd-logind[811]: Session 4 logged out. Waiting for processes to exit.
Nov 28 19:05:07 np0005539279 systemd-logind[811]: Removed session 4.
Nov 28 19:05:09 np0005539279 systemd-logind[811]: New session 5 of user zuul.
Nov 28 19:05:09 np0005539279 systemd[1]: Started Session 5 of User zuul.
Nov 28 19:05:09 np0005539279 python3[8115]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 19:05:15 np0005539279 irqbalance[796]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 28 19:05:15 np0005539279 irqbalance[796]: IRQ 27 affinity is now unmanaged
Nov 28 19:05:23 np0005539279 kernel: SELinux:  Converting 390 SID table entries...
Nov 28 19:05:23 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:05:23 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:05:23 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:05:23 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:05:23 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:05:23 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:05:23 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:05:32 np0005539279 kernel: SELinux:  Converting 390 SID table entries...
Nov 28 19:05:32 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:05:32 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:05:32 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:05:32 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:05:32 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:05:32 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:05:32 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:05:42 np0005539279 kernel: SELinux:  Converting 390 SID table entries...
Nov 28 19:05:42 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:05:42 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:05:42 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:05:42 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:05:42 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:05:42 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:05:42 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:05:44 np0005539279 setsebool[8177]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 28 19:05:44 np0005539279 setsebool[8177]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 28 19:05:54 np0005539279 kernel: SELinux:  Converting 393 SID table entries...
Nov 28 19:05:54 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:05:54 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:05:54 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:05:54 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:05:54 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:05:54 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:05:54 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:06:17 np0005539279 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 28 19:06:17 np0005539279 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 19:06:17 np0005539279 systemd[1]: Starting man-db-cache-update.service...
Nov 28 19:06:17 np0005539279 systemd[1]: Reloading.
Nov 28 19:06:17 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:06:17 np0005539279 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 19:06:19 np0005539279 python3[10150]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-d1af-7d08-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:06:20 np0005539279 kernel: evm: overlay not supported
Nov 28 19:06:20 np0005539279 systemd[4302]: Starting D-Bus User Message Bus...
Nov 28 19:06:20 np0005539279 dbus-broker-launch[11402]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 28 19:06:20 np0005539279 dbus-broker-launch[11402]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 28 19:06:20 np0005539279 systemd[4302]: Started D-Bus User Message Bus.
Nov 28 19:06:20 np0005539279 dbus-broker-lau[11402]: Ready
Nov 28 19:06:20 np0005539279 systemd[4302]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 28 19:06:20 np0005539279 systemd[4302]: Created slice Slice /user.
Nov 28 19:06:20 np0005539279 systemd[4302]: podman-11125.scope: unit configures an IP firewall, but not running as root.
Nov 28 19:06:20 np0005539279 systemd[4302]: (This warning is only shown for the first unit using IP firewalling.)
Nov 28 19:06:20 np0005539279 systemd[4302]: Started podman-11125.scope.
Nov 28 19:06:21 np0005539279 systemd[4302]: Started podman-pause-01a3d4d5.scope.
Nov 28 19:06:21 np0005539279 python3[11994]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.44:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.44:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:06:21 np0005539279 python3[11994]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 28 19:06:21 np0005539279 systemd[1]: session-5.scope: Deactivated successfully.
Nov 28 19:06:21 np0005539279 systemd[1]: session-5.scope: Consumed 1min 259ms CPU time.
Nov 28 19:06:21 np0005539279 systemd-logind[811]: Session 5 logged out. Waiting for processes to exit.
Nov 28 19:06:21 np0005539279 systemd-logind[811]: Removed session 5.
Nov 28 19:06:45 np0005539279 systemd-logind[811]: New session 6 of user zuul.
Nov 28 19:06:45 np0005539279 systemd[1]: Started Session 6 of User zuul.
Nov 28 19:06:45 np0005539279 python3[23304]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC71GoT37hIC7cHQWCqHeKXhRhKWw4qc/IMtNGgGQFLX0bCLEU19QzFKxL8CFAmYq2t2/+NB8gny3QTKVcqAVfg= zuul@np0005539278.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 19:06:46 np0005539279 python3[23470]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC71GoT37hIC7cHQWCqHeKXhRhKWw4qc/IMtNGgGQFLX0bCLEU19QzFKxL8CFAmYq2t2/+NB8gny3QTKVcqAVfg= zuul@np0005539278.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 19:06:47 np0005539279 python3[23775]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005539279.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 28 19:06:47 np0005539279 python3[24005]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBC71GoT37hIC7cHQWCqHeKXhRhKWw4qc/IMtNGgGQFLX0bCLEU19QzFKxL8CFAmYq2t2/+NB8gny3QTKVcqAVfg= zuul@np0005539278.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 19:06:48 np0005539279 python3[24275]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 19:06:48 np0005539279 python3[24591]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764374807.796381-135-6058579200474/source _original_basename=tmpkfqejruc follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:06:49 np0005539279 python3[24926]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Nov 28 19:06:49 np0005539279 systemd[1]: Starting Hostname Service...
Nov 28 19:06:49 np0005539279 systemd[1]: Started Hostname Service.
Nov 28 19:06:49 np0005539279 systemd-hostnamed[25051]: Changed pretty hostname to 'compute-0'
Nov 28 19:06:49 np0005539279 systemd-hostnamed[25051]: Hostname set to <compute-0> (static)
Nov 28 19:06:49 np0005539279 NetworkManager[7186]: <info>  [1764374809.4946] hostname: static hostname changed from "np0005539279.novalocal" to "compute-0"
Nov 28 19:06:49 np0005539279 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 19:06:49 np0005539279 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 19:06:49 np0005539279 systemd[1]: session-6.scope: Deactivated successfully.
Nov 28 19:06:49 np0005539279 systemd[1]: session-6.scope: Consumed 2.381s CPU time.
Nov 28 19:06:49 np0005539279 systemd-logind[811]: Session 6 logged out. Waiting for processes to exit.
Nov 28 19:06:49 np0005539279 systemd-logind[811]: Removed session 6.
Nov 28 19:06:59 np0005539279 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 19:07:02 np0005539279 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 19:07:02 np0005539279 systemd[1]: Finished man-db-cache-update.service.
Nov 28 19:07:02 np0005539279 systemd[1]: man-db-cache-update.service: Consumed 53.854s CPU time.
Nov 28 19:07:02 np0005539279 systemd[1]: run-rd91646b1aeb24a56aa30b909df9c6e54.service: Deactivated successfully.
Nov 28 19:07:19 np0005539279 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 19:09:41 np0005539279 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 28 19:09:41 np0005539279 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 28 19:09:41 np0005539279 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 28 19:09:41 np0005539279 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 28 19:11:38 np0005539279 systemd-logind[811]: New session 7 of user zuul.
Nov 28 19:11:38 np0005539279 systemd[1]: Started Session 7 of User zuul.
Nov 28 19:11:39 np0005539279 python3[30020]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:11:41 np0005539279 python3[30137]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 19:11:42 np0005539279 python3[30210]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764375101.371978-33552-263008136957930/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:11:42 np0005539279 python3[30236]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 19:11:42 np0005539279 python3[30309]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764375101.371978-33552-263008136957930/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:11:43 np0005539279 python3[30335]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 19:11:43 np0005539279 python3[30408]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764375101.371978-33552-263008136957930/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:11:43 np0005539279 python3[30434]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 19:11:44 np0005539279 python3[30507]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764375101.371978-33552-263008136957930/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:11:44 np0005539279 python3[30534]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 19:11:44 np0005539279 python3[30607]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764375101.371978-33552-263008136957930/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:11:44 np0005539279 python3[30633]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 19:11:45 np0005539279 python3[30706]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764375101.371978-33552-263008136957930/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:11:45 np0005539279 python3[30732]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 19:11:45 np0005539279 python3[30805]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764375101.371978-33552-263008136957930/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:11:57 np0005539279 python3[30863]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:16:56 np0005539279 systemd[1]: session-7.scope: Deactivated successfully.
Nov 28 19:16:56 np0005539279 systemd[1]: session-7.scope: Consumed 5.104s CPU time.
Nov 28 19:16:56 np0005539279 systemd-logind[811]: Session 7 logged out. Waiting for processes to exit.
Nov 28 19:16:56 np0005539279 systemd-logind[811]: Removed session 7.
Nov 28 19:27:31 np0005539279 systemd-logind[811]: New session 8 of user zuul.
Nov 28 19:27:31 np0005539279 systemd[1]: Started Session 8 of User zuul.
Nov 28 19:27:32 np0005539279 python3.9[31111]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:27:34 np0005539279 python3.9[31292]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:27:41 np0005539279 systemd[1]: session-8.scope: Deactivated successfully.
Nov 28 19:27:41 np0005539279 systemd[1]: session-8.scope: Consumed 8.248s CPU time.
Nov 28 19:27:41 np0005539279 systemd-logind[811]: Session 8 logged out. Waiting for processes to exit.
Nov 28 19:27:41 np0005539279 systemd-logind[811]: Removed session 8.
Nov 28 19:27:47 np0005539279 systemd-logind[811]: New session 9 of user zuul.
Nov 28 19:27:47 np0005539279 systemd[1]: Started Session 9 of User zuul.
Nov 28 19:27:48 np0005539279 python3.9[31503]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:27:48 np0005539279 systemd[1]: session-9.scope: Deactivated successfully.
Nov 28 19:27:48 np0005539279 systemd-logind[811]: Session 9 logged out. Waiting for processes to exit.
Nov 28 19:27:48 np0005539279 systemd-logind[811]: Removed session 9.
Nov 28 19:28:04 np0005539279 systemd-logind[811]: New session 10 of user zuul.
Nov 28 19:28:04 np0005539279 systemd[1]: Started Session 10 of User zuul.
Nov 28 19:28:05 np0005539279 python3.9[31686]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 28 19:28:06 np0005539279 python3.9[31860]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:28:07 np0005539279 python3.9[32012]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:28:08 np0005539279 python3.9[32165]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:28:09 np0005539279 python3.9[32317]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:28:10 np0005539279 python3.9[32469]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:28:11 np0005539279 python3.9[32592]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376089.8648467-73-50696127089508/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:28:12 np0005539279 python3.9[32744]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:28:13 np0005539279 python3.9[32901]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:28:14 np0005539279 python3.9[33053]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:28:14 np0005539279 python3.9[33203]: ansible-ansible.builtin.service_facts Invoked
Nov 28 19:28:19 np0005539279 python3.9[33458]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:28:19 np0005539279 python3.9[33608]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:28:21 np0005539279 python3.9[33762]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:28:22 np0005539279 python3.9[33920]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:28:23 np0005539279 python3.9[34004]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:29:10 np0005539279 systemd[1]: Reloading.
Nov 28 19:29:10 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:29:10 np0005539279 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 28 19:29:10 np0005539279 systemd[1]: Reloading.
Nov 28 19:29:10 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:29:10 np0005539279 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 28 19:29:11 np0005539279 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 28 19:29:11 np0005539279 systemd[1]: Reloading.
Nov 28 19:29:11 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:29:11 np0005539279 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 28 19:29:11 np0005539279 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Nov 28 19:29:11 np0005539279 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Nov 28 19:29:11 np0005539279 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Nov 28 19:30:21 np0005539279 kernel: SELinux:  Converting 2717 SID table entries...
Nov 28 19:30:21 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:30:21 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:30:21 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:30:21 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:30:21 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:30:21 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:30:21 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:30:21 np0005539279 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 28 19:30:21 np0005539279 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 19:30:21 np0005539279 systemd[1]: Starting man-db-cache-update.service...
Nov 28 19:30:21 np0005539279 systemd[1]: Reloading.
Nov 28 19:30:21 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:30:21 np0005539279 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 19:30:22 np0005539279 python3.9[35538]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:30:24 np0005539279 python3.9[35819]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 28 19:30:25 np0005539279 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 19:30:25 np0005539279 systemd[1]: Finished man-db-cache-update.service.
Nov 28 19:30:25 np0005539279 systemd[1]: man-db-cache-update.service: Consumed 1.359s CPU time.
Nov 28 19:30:25 np0005539279 systemd[1]: run-r2db409aeb1c54fab91971707cc87adf4.service: Deactivated successfully.
Nov 28 19:30:25 np0005539279 python3.9[35973]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 28 19:30:28 np0005539279 python3.9[36128]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:30:29 np0005539279 python3.9[36280]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 28 19:30:30 np0005539279 python3.9[36432]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:30:31 np0005539279 python3.9[36584]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:30:31 np0005539279 python3.9[36707]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376230.8539822-236-68423098877681/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=d91c19d3d5949c7bafe97146b80f0a711185a1cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:30:35 np0005539279 python3.9[36859]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:30:41 np0005539279 python3.9[37011]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:30:42 np0005539279 python3.9[37164]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:30:43 np0005539279 python3.9[37316]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 28 19:30:43 np0005539279 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 19:30:44 np0005539279 python3.9[37470]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 19:30:45 np0005539279 python3.9[37628]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 19:30:46 np0005539279 python3.9[37788]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 28 19:30:47 np0005539279 python3.9[37943]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 19:30:49 np0005539279 python3.9[38101]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 28 19:30:50 np0005539279 python3.9[38253]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:30:53 np0005539279 python3.9[38406]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:30:54 np0005539279 python3.9[38558]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:30:55 np0005539279 python3.9[38681]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376253.9174628-355-235927529369673/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:30:56 np0005539279 python3.9[38835]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:30:56 np0005539279 systemd[1]: Starting Load Kernel Modules...
Nov 28 19:30:56 np0005539279 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 28 19:30:56 np0005539279 kernel: Bridge firewalling registered
Nov 28 19:30:56 np0005539279 systemd-modules-load[38839]: Inserted module 'br_netfilter'
Nov 28 19:30:56 np0005539279 systemd[1]: Finished Load Kernel Modules.
Nov 28 19:30:57 np0005539279 python3.9[38994]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:30:57 np0005539279 python3.9[39119]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376256.7503362-378-183032760592517/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:30:58 np0005539279 python3.9[39271]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:31:02 np0005539279 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Nov 28 19:31:02 np0005539279 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Nov 28 19:31:02 np0005539279 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 19:31:02 np0005539279 systemd[1]: Starting man-db-cache-update.service...
Nov 28 19:31:02 np0005539279 systemd[1]: Reloading.
Nov 28 19:31:02 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:31:02 np0005539279 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 19:31:04 np0005539279 python3.9[40580]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:31:04 np0005539279 python3.9[41471]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 28 19:31:05 np0005539279 python3.9[42119]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:31:06 np0005539279 python3.9[42937]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:31:06 np0005539279 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 19:31:06 np0005539279 systemd[1]: Starting Authorization Manager...
Nov 28 19:31:06 np0005539279 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 19:31:06 np0005539279 systemd[1]: Finished man-db-cache-update.service.
Nov 28 19:31:06 np0005539279 systemd[1]: man-db-cache-update.service: Consumed 5.138s CPU time.
Nov 28 19:31:06 np0005539279 systemd[1]: run-r07a1d9f497494835846b917069607f90.service: Deactivated successfully.
Nov 28 19:31:06 np0005539279 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 19:31:06 np0005539279 polkitd[43682]: Started polkitd version 0.117
Nov 28 19:31:06 np0005539279 systemd[1]: Started Authorization Manager.
Nov 28 19:31:07 np0005539279 python3.9[43853]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:31:07 np0005539279 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 19:31:07 np0005539279 systemd[1]: tuned.service: Deactivated successfully.
Nov 28 19:31:07 np0005539279 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 19:31:07 np0005539279 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 19:31:08 np0005539279 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 19:31:08 np0005539279 python3.9[44015]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 28 19:31:11 np0005539279 python3.9[44167]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:31:11 np0005539279 systemd[1]: Reloading.
Nov 28 19:31:11 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:31:11 np0005539279 systemd[1]: Starting dnf makecache...
Nov 28 19:31:11 np0005539279 dnf[44205]: Failed determining last makecache time.
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-openstack-barbican-42b4c41831408a8e323 104 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 159 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-openstack-cinder-1c00d6490d88e436f26ef 125 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-python-stevedore-c4acc5639fd2329372142 147 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-python-cloudkitty-tests-tempest-2c80f8 152 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-os-net-config-9758ab42364673d01bc5014e 152 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 135 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-python-designate-tests-tempest-347fdbc 165 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-openstack-glance-1fd12c29b339f30fe823e 146 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 132 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-openstack-manila-3c01b7181572c95dac462 147 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-python-whitebox-neutron-tests-tempest- 139 kB/s | 3.0 kB     00:00
Nov 28 19:31:11 np0005539279 dnf[44205]: delorean-openstack-octavia-ba397f07a7331190208c 139 kB/s | 3.0 kB     00:00
Nov 28 19:31:12 np0005539279 dnf[44205]: delorean-openstack-watcher-c014f81a8647287f6dcc 150 kB/s | 3.0 kB     00:00
Nov 28 19:31:12 np0005539279 dnf[44205]: delorean-python-tcib-1124124ec06aadbac34f0d340b 149 kB/s | 3.0 kB     00:00
Nov 28 19:31:12 np0005539279 dnf[44205]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 159 kB/s | 3.0 kB     00:00
Nov 28 19:31:12 np0005539279 dnf[44205]: delorean-openstack-swift-dc98a8463506ac520c469a 162 kB/s | 3.0 kB     00:00
Nov 28 19:31:12 np0005539279 dnf[44205]: delorean-python-tempestconf-8515371b7cceebd4282 155 kB/s | 3.0 kB     00:00
Nov 28 19:31:12 np0005539279 dnf[44205]: delorean-openstack-heat-ui-013accbfd179753bc3f0 152 kB/s | 3.0 kB     00:00
Nov 28 19:31:12 np0005539279 dnf[44205]: CentOS Stream 9 - BaseOS                         76 kB/s | 7.3 kB     00:00
Nov 28 19:31:12 np0005539279 python3.9[44370]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:31:12 np0005539279 systemd[1]: Reloading.
Nov 28 19:31:12 np0005539279 dnf[44205]: CentOS Stream 9 - AppStream                      32 kB/s | 7.4 kB     00:00
Nov 28 19:31:12 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:31:12 np0005539279 dnf[44205]: CentOS Stream 9 - CRB                            25 kB/s | 7.2 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: CentOS Stream 9 - Extras packages                85 kB/s | 8.3 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: dlrn-antelope-testing                            89 kB/s | 3.0 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: dlrn-antelope-build-deps                        129 kB/s | 3.0 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: centos9-rabbitmq                                 34 kB/s | 3.0 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: centos9-storage                                  36 kB/s | 3.0 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: centos9-opstools                                 84 kB/s | 3.0 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: NFV SIG OpenvSwitch                             116 kB/s | 3.0 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: repo-setup-centos-appstream                     183 kB/s | 4.4 kB     00:00
Nov 28 19:31:13 np0005539279 python3.9[44580]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:31:13 np0005539279 dnf[44205]: repo-setup-centos-baseos                         91 kB/s | 3.9 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: repo-setup-centos-highavailability              158 kB/s | 3.9 kB     00:00
Nov 28 19:31:13 np0005539279 dnf[44205]: repo-setup-centos-powertools                     26 kB/s | 4.3 kB     00:00
Nov 28 19:31:14 np0005539279 dnf[44205]: Extra Packages for Enterprise Linux 9 - x86_64  263 kB/s |  34 kB     00:00
Nov 28 19:31:14 np0005539279 python3.9[44742]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:31:14 np0005539279 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 28 19:31:14 np0005539279 dnf[44205]: Metadata cache created.
Nov 28 19:31:14 np0005539279 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 19:31:14 np0005539279 systemd[1]: Finished dnf makecache.
Nov 28 19:31:14 np0005539279 systemd[1]: dnf-makecache.service: Consumed 1.917s CPU time.
Nov 28 19:31:15 np0005539279 python3.9[44895]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:31:15 np0005539279 irqbalance[796]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 28 19:31:15 np0005539279 irqbalance[796]: IRQ 26 affinity is now unmanaged
Nov 28 19:31:17 np0005539279 python3.9[45057]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:31:18 np0005539279 python3.9[45210]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:31:18 np0005539279 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 19:31:18 np0005539279 systemd[1]: Stopped Apply Kernel Variables.
Nov 28 19:31:18 np0005539279 systemd[1]: Stopping Apply Kernel Variables...
Nov 28 19:31:18 np0005539279 systemd[1]: Starting Apply Kernel Variables...
Nov 28 19:31:18 np0005539279 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 19:31:18 np0005539279 systemd[1]: Finished Apply Kernel Variables.
Nov 28 19:31:18 np0005539279 systemd[1]: session-10.scope: Deactivated successfully.
Nov 28 19:31:18 np0005539279 systemd[1]: session-10.scope: Consumed 2min 21.721s CPU time.
Nov 28 19:31:18 np0005539279 systemd-logind[811]: Session 10 logged out. Waiting for processes to exit.
Nov 28 19:31:18 np0005539279 systemd-logind[811]: Removed session 10.
Nov 28 19:31:24 np0005539279 systemd-logind[811]: New session 11 of user zuul.
Nov 28 19:31:24 np0005539279 systemd[1]: Started Session 11 of User zuul.
Nov 28 19:31:26 np0005539279 python3.9[45393]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:31:27 np0005539279 python3.9[45547]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:31:28 np0005539279 python3.9[45705]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:31:29 np0005539279 python3.9[45856]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:31:30 np0005539279 python3.9[46012]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:31:31 np0005539279 python3.9[46096]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:31:33 np0005539279 python3.9[46249]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:31:34 np0005539279 python3.9[46420]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:31:35 np0005539279 python3.9[46572]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:31:35 np0005539279 podman[46573]: 2025-11-29 00:31:35.658765714 +0000 UTC m=+0.045034971 system refresh
Nov 28 19:31:36 np0005539279 python3.9[46735]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:31:36 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:31:37 np0005539279 python3.9[46858]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376295.8913949-109-9929183876022/.source.json follow=False _original_basename=podman_network_config.j2 checksum=382275f72f30a7b52c5415abef30d3f3809ac4de backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:31:38 np0005539279 python3.9[47010]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:31:38 np0005539279 python3.9[47133]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376297.7214093-124-165988864712537/.source.conf follow=False _original_basename=registries.conf.j2 checksum=ea7e71ddf075bf55e555c64399d15b2ffe005fe9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:31:39 np0005539279 python3.9[47287]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:31:40 np0005539279 python3.9[47439]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:31:41 np0005539279 python3.9[47591]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:31:42 np0005539279 python3.9[47743]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:31:43 np0005539279 python3.9[47893]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:31:44 np0005539279 python3.9[48047]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:31:46 np0005539279 python3.9[48200]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:31:48 np0005539279 python3.9[48360]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:31:50 np0005539279 python3.9[48513]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:31:53 np0005539279 python3.9[48666]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:31:55 np0005539279 python3.9[48822]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:31:59 np0005539279 python3.9[48992]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:32:01 np0005539279 python3.9[49149]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:32:17 np0005539279 python3.9[49492]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:32:19 np0005539279 python3.9[49648]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:32:20 np0005539279 python3.9[49823]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:32:20 np0005539279 python3.9[49948]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764376339.5198665-272-97549562017605/.source.json _original_basename=.14pjgdrr follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:32:22 np0005539279 python3.9[50100]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 19:32:22 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:24 np0005539279 systemd[1]: var-lib-containers-storage-overlay-compat2642944845-lower\x2dmapped.mount: Deactivated successfully.
Nov 28 19:32:28 np0005539279 podman[50112]: 2025-11-29 00:32:28.233899725 +0000 UTC m=+6.036733455 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 19:32:28 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:28 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:28 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:29 np0005539279 python3.9[50408]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 19:32:29 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:41 np0005539279 podman[50419]: 2025-11-29 00:32:41.533403438 +0000 UTC m=+12.005245169 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 19:32:41 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:41 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:41 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:42 np0005539279 python3.9[50720]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 19:32:42 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:44 np0005539279 podman[50733]: 2025-11-29 00:32:44.090360452 +0000 UTC m=+1.371622436 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 19:32:44 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:44 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:44 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:45 np0005539279 python3.9[50968]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 19:32:45 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:57 np0005539279 podman[50980]: 2025-11-29 00:32:57.741655825 +0000 UTC m=+12.482794784 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 19:32:57 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:57 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:57 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:32:59 np0005539279 python3.9[51254]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 19:32:59 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:33:05 np0005539279 podman[51265]: 2025-11-29 00:33:05.583147085 +0000 UTC m=+6.417032334 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 19:33:05 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:33:05 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:33:05 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:33:06 np0005539279 python3.9[51522]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 19:33:06 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:33:08 np0005539279 podman[51535]: 2025-11-29 00:33:08.043942846 +0000 UTC m=+1.376086684 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 28 19:33:08 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:33:08 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:33:08 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:33:08 np0005539279 systemd[1]: session-11.scope: Deactivated successfully.
Nov 28 19:33:08 np0005539279 systemd[1]: session-11.scope: Consumed 1min 58.557s CPU time.
Nov 28 19:33:08 np0005539279 systemd-logind[811]: Session 11 logged out. Waiting for processes to exit.
Nov 28 19:33:08 np0005539279 systemd-logind[811]: Removed session 11.
Nov 28 19:33:13 np0005539279 systemd-logind[811]: New session 12 of user zuul.
Nov 28 19:33:13 np0005539279 systemd[1]: Started Session 12 of User zuul.
Nov 28 19:33:14 np0005539279 python3.9[51839]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:33:16 np0005539279 python3.9[51995]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 28 19:33:17 np0005539279 python3.9[52148]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 19:33:18 np0005539279 python3.9[52306]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 19:33:19 np0005539279 python3.9[52466]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:33:20 np0005539279 python3.9[52550]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:33:22 np0005539279 python3.9[52715]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:33:37 np0005539279 kernel: SELinux:  Converting 2731 SID table entries...
Nov 28 19:33:37 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:33:37 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:33:37 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:33:37 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:33:37 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:33:37 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:33:37 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:33:37 np0005539279 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 28 19:33:37 np0005539279 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 28 19:33:38 np0005539279 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 19:33:39 np0005539279 systemd[1]: Starting man-db-cache-update.service...
Nov 28 19:33:39 np0005539279 systemd[1]: Reloading.
Nov 28 19:33:39 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:33:39 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:33:39 np0005539279 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 19:33:40 np0005539279 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 19:33:40 np0005539279 systemd[1]: Finished man-db-cache-update.service.
Nov 28 19:33:40 np0005539279 systemd[1]: man-db-cache-update.service: Consumed 1.129s CPU time.
Nov 28 19:33:40 np0005539279 systemd[1]: run-r22ad54b5676040cc8bc699e7550f6f29.service: Deactivated successfully.
Nov 28 19:33:41 np0005539279 python3.9[53818]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 19:33:41 np0005539279 systemd[1]: Reloading.
Nov 28 19:33:41 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:33:41 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:33:41 np0005539279 systemd[1]: Starting Open vSwitch Database Unit...
Nov 28 19:33:41 np0005539279 chown[53859]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 28 19:33:41 np0005539279 ovs-ctl[53864]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 28 19:33:41 np0005539279 ovs-ctl[53864]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 28 19:33:41 np0005539279 ovs-ctl[53864]: Starting ovsdb-server [  OK  ]
Nov 28 19:33:41 np0005539279 ovs-vsctl[53913]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 28 19:33:42 np0005539279 ovs-vsctl[53929]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"bb6a090d-c99b-4a6a-9b20-ad4330625b75\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 28 19:33:42 np0005539279 ovs-ctl[53864]: Configuring Open vSwitch system IDs [  OK  ]
Nov 28 19:33:42 np0005539279 ovs-vsctl[53939]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 28 19:33:42 np0005539279 ovs-ctl[53864]: Enabling remote OVSDB managers [  OK  ]
Nov 28 19:33:42 np0005539279 systemd[1]: Started Open vSwitch Database Unit.
Nov 28 19:33:42 np0005539279 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 28 19:33:42 np0005539279 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 28 19:33:42 np0005539279 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 28 19:33:42 np0005539279 kernel: openvswitch: Open vSwitch switching datapath
Nov 28 19:33:42 np0005539279 ovs-ctl[53983]: Inserting openvswitch module [  OK  ]
Nov 28 19:33:42 np0005539279 ovs-ctl[53952]: Starting ovs-vswitchd [  OK  ]
Nov 28 19:33:42 np0005539279 ovs-vsctl[54000]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 28 19:33:42 np0005539279 ovs-ctl[53952]: Enabling remote OVSDB managers [  OK  ]
Nov 28 19:33:42 np0005539279 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 28 19:33:42 np0005539279 systemd[1]: Starting Open vSwitch...
Nov 28 19:33:42 np0005539279 systemd[1]: Finished Open vSwitch.
Nov 28 19:33:43 np0005539279 python3.9[54152]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:33:44 np0005539279 python3.9[54304]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 28 19:33:46 np0005539279 kernel: SELinux:  Converting 2745 SID table entries...
Nov 28 19:33:46 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:33:46 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:33:46 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:33:46 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:33:46 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:33:46 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:33:46 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:33:47 np0005539279 python3.9[54465]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:33:48 np0005539279 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 28 19:33:48 np0005539279 python3.9[54623]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:33:50 np0005539279 python3.9[54776]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:33:52 np0005539279 python3.9[55065]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 19:33:53 np0005539279 python3.9[55215]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:33:54 np0005539279 python3.9[55369]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:33:56 np0005539279 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 19:33:56 np0005539279 systemd[1]: Starting man-db-cache-update.service...
Nov 28 19:33:56 np0005539279 systemd[1]: Reloading.
Nov 28 19:33:56 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:33:56 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:33:56 np0005539279 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 19:33:57 np0005539279 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 19:33:57 np0005539279 systemd[1]: Finished man-db-cache-update.service.
Nov 28 19:33:57 np0005539279 systemd[1]: run-r4d4818edb113436f9af2de731e8c6cad.service: Deactivated successfully.
Nov 28 19:33:58 np0005539279 python3.9[55690]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:33:58 np0005539279 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 28 19:33:58 np0005539279 systemd[1]: Stopped Network Manager Wait Online.
Nov 28 19:33:58 np0005539279 systemd[1]: Stopping Network Manager Wait Online...
Nov 28 19:33:58 np0005539279 systemd[1]: Stopping Network Manager...
Nov 28 19:33:58 np0005539279 NetworkManager[7186]: <info>  [1764376438.1686] caught SIGTERM, shutting down normally.
Nov 28 19:33:58 np0005539279 NetworkManager[7186]: <info>  [1764376438.1704] dhcp4 (eth0): canceled DHCP transaction
Nov 28 19:33:58 np0005539279 NetworkManager[7186]: <info>  [1764376438.1705] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 19:33:58 np0005539279 NetworkManager[7186]: <info>  [1764376438.1705] dhcp4 (eth0): state changed no lease
Nov 28 19:33:58 np0005539279 NetworkManager[7186]: <info>  [1764376438.1707] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 19:33:58 np0005539279 NetworkManager[7186]: <info>  [1764376438.1775] exiting (success)
Nov 28 19:33:58 np0005539279 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 19:33:58 np0005539279 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 19:33:58 np0005539279 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 28 19:33:58 np0005539279 systemd[1]: Stopped Network Manager.
Nov 28 19:33:58 np0005539279 systemd[1]: NetworkManager.service: Consumed 15.040s CPU time, 4.1M memory peak, read 0B from disk, written 16.0K to disk.
Nov 28 19:33:58 np0005539279 systemd[1]: Starting Network Manager...
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.2446] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:5b517f8a-0dc6-4308-b13f-84fe445a9842)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.2450] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.2508] manager[0x55812ddbd090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 19:33:58 np0005539279 systemd[1]: Starting Hostname Service...
Nov 28 19:33:58 np0005539279 systemd[1]: Started Hostname Service.
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3380] hostname: hostname: using hostnamed
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3381] hostname: static hostname changed from (none) to "compute-0"
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3386] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3390] manager[0x55812ddbd090]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3390] manager[0x55812ddbd090]: rfkill: WWAN hardware radio set enabled
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3411] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3421] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3421] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3422] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3422] manager: Networking is enabled by state file
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3425] settings: Loaded settings plugin: keyfile (internal)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3428] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3451] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3460] dhcp: init: Using DHCP client 'internal'
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3462] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3467] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3472] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3478] device (lo): Activation: starting connection 'lo' (abb2dd2c-7d4b-48ba-b333-c50c8b96e666)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3484] device (eth0): carrier: link connected
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3487] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3490] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3491] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3496] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3501] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3506] device (eth1): carrier: link connected
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3509] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3512] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f43141e8-2e9e-5be9-96d8-284c209fde47) (indicated)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3513] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3518] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3524] device (eth1): Activation: starting connection 'ci-private-network' (f43141e8-2e9e-5be9-96d8-284c209fde47)
Nov 28 19:33:58 np0005539279 systemd[1]: Started Network Manager.
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3531] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3547] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3549] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3550] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3551] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3553] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3555] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3557] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3559] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3564] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3566] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3573] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3582] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3601] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3606] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3673] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3682] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3686] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3688] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3694] device (lo): Activation: successful, device activated.
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3702] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3706] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3710] device (eth1): Activation: successful, device activated.
Nov 28 19:33:58 np0005539279 systemd[1]: Starting Network Manager Wait Online...
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3731] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3732] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3737] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3741] device (eth0): Activation: successful, device activated.
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3747] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 19:33:58 np0005539279 NetworkManager[55703]: <info>  [1764376438.3751] manager: startup complete
Nov 28 19:33:58 np0005539279 systemd[1]: Finished Network Manager Wait Online.
Nov 28 19:33:59 np0005539279 python3.9[55916]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:34:03 np0005539279 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 19:34:04 np0005539279 systemd[1]: Starting man-db-cache-update.service...
Nov 28 19:34:04 np0005539279 systemd[1]: Reloading.
Nov 28 19:34:04 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:34:04 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:34:04 np0005539279 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 19:34:05 np0005539279 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 19:34:05 np0005539279 systemd[1]: Finished man-db-cache-update.service.
Nov 28 19:34:05 np0005539279 systemd[1]: run-r0bc403d7d95d4aa3aff08aa577b70f66.service: Deactivated successfully.
Nov 28 19:34:06 np0005539279 python3.9[56373]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:34:07 np0005539279 python3.9[56525]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:07 np0005539279 python3.9[56679]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:08 np0005539279 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 19:34:08 np0005539279 python3.9[56831]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:09 np0005539279 python3.9[56983]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:10 np0005539279 python3.9[57135]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:10 np0005539279 python3.9[57287]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:34:11 np0005539279 python3.9[57410]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376450.2491195-229-104667617395860/.source _original_basename=.x7j2losu follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:12 np0005539279 python3.9[57562]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:13 np0005539279 python3.9[57714]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 28 19:34:14 np0005539279 python3.9[57866]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:16 np0005539279 python3.9[58293]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 28 19:34:18 np0005539279 ansible-async_wrapper.py[58468]: Invoked with j223799640406 300 /home/zuul/.ansible/tmp/ansible-tmp-1764376457.250488-295-112481090941067/AnsiballZ_edpm_os_net_config.py _
Nov 28 19:34:18 np0005539279 ansible-async_wrapper.py[58471]: Starting module and watcher
Nov 28 19:34:18 np0005539279 ansible-async_wrapper.py[58471]: Start watching 58472 (300)
Nov 28 19:34:18 np0005539279 ansible-async_wrapper.py[58472]: Start module (58472)
Nov 28 19:34:18 np0005539279 ansible-async_wrapper.py[58468]: Return async_wrapper task started.
Nov 28 19:34:18 np0005539279 python3.9[58473]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 28 19:34:18 np0005539279 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 28 19:34:18 np0005539279 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 28 19:34:18 np0005539279 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 28 19:34:18 np0005539279 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 28 19:34:18 np0005539279 kernel: cfg80211: failed to load regulatory.db
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.3848] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.3875] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4443] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4446] audit: op="connection-add" uuid="70e768cf-87f0-421b-92ef-f50c421de218" name="br-ex-br" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4466] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4468] audit: op="connection-add" uuid="83b727e0-23cd-4aab-862f-756ba9609a74" name="br-ex-port" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4484] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4486] audit: op="connection-add" uuid="889ea019-a45d-4e60-a4fa-266fe43537e7" name="eth1-port" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4504] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4507] audit: op="connection-add" uuid="f1538cf4-c8aa-4944-9d55-9d875eb52179" name="vlan20-port" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4522] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4526] audit: op="connection-add" uuid="a618f892-1f2d-4529-b4f0-3db0e1e90117" name="vlan21-port" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4541] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4544] audit: op="connection-add" uuid="ba0fcddc-bce8-43fd-9557-afc904b7684d" name="vlan22-port" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4571] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4594] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.4597] audit: op="connection-add" uuid="4e15af57-9404-42a2-9854-390a4ac53d3f" name="br-ex-if" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5219] audit: op="connection-update" uuid="f43141e8-2e9e-5be9-96d8-284c209fde47" name="ci-private-network" args="ovs-interface.type,ovs-external-ids.data,connection.timestamp,connection.controller,connection.master,connection.port-type,connection.slave-type,ipv4.routing-rules,ipv4.addresses,ipv4.never-default,ipv4.method,ipv4.dns,ipv4.routes,ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.addresses,ipv6.routes,ipv6.method,ipv6.dns" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5257] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5262] audit: op="connection-add" uuid="311a043c-81fc-4daf-afc6-0462c489e96d" name="vlan20-if" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5296] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5300] audit: op="connection-add" uuid="39b6f9c8-5e5e-4ac4-b1bb-1eafad15b332" name="vlan21-if" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5333] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5338] audit: op="connection-add" uuid="7b47f90b-ad3b-4558-bea8-6d01988d22b5" name="vlan22-if" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5359] audit: op="connection-delete" uuid="17a1339d-392e-3454-a32b-9b13aa44301c" name="Wired connection 1" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5382] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5401] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5410] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (70e768cf-87f0-421b-92ef-f50c421de218)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5412] audit: op="connection-activate" uuid="70e768cf-87f0-421b-92ef-f50c421de218" name="br-ex-br" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5417] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5431] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5439] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (83b727e0-23cd-4aab-862f-756ba9609a74)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5445] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5457] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5469] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (889ea019-a45d-4e60-a4fa-266fe43537e7)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5475] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5490] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5500] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (f1538cf4-c8aa-4944-9d55-9d875eb52179)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5505] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5519] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5529] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (a618f892-1f2d-4529-b4f0-3db0e1e90117)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5533] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5549] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5560] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (ba0fcddc-bce8-43fd-9557-afc904b7684d)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5563] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5569] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5573] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5586] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5596] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5605] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (4e15af57-9404-42a2-9854-390a4ac53d3f)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5608] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5614] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5618] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5621] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5625] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5646] device (eth1): disconnecting for new activation request.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5648] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5654] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5658] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5661] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5668] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5679] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5688] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (311a043c-81fc-4daf-afc6-0462c489e96d)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5691] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5698] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5703] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5707] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5713] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5724] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5733] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (39b6f9c8-5e5e-4ac4-b1bb-1eafad15b332)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5735] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5741] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5744] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5746] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5751] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5761] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5770] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (7b47f90b-ad3b-4558-bea8-6d01988d22b5)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5772] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5779] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5782] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5785] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5789] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5815] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5819] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5826] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5829] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5843] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5850] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5856] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5862] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 kernel: ovs-system: entered promiscuous mode
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5865] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5873] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5883] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5891] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5894] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5899] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5904] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 kernel: Timeout policy base is empty
Nov 28 19:34:20 np0005539279 systemd-udevd[58478]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5908] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5910] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5916] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5920] dhcp4 (eth0): canceled DHCP transaction
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5920] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5921] dhcp4 (eth0): state changed no lease
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5922] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5934] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5937] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58474 uid=0 result="fail" reason="Device is not activated"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5944] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5953] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5962] device (eth1): disconnecting for new activation request.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5964] audit: op="connection-activate" uuid="f43141e8-2e9e-5be9-96d8-284c209fde47" name="ci-private-network" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5966] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.5970] dhcp4 (eth0): state changed new lease, address=38.102.83.110
Nov 28 19:34:20 np0005539279 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 19:34:20 np0005539279 kernel: br-ex: entered promiscuous mode
Nov 28 19:34:20 np0005539279 kernel: vlan22: entered promiscuous mode
Nov 28 19:34:20 np0005539279 kernel: vlan21: entered promiscuous mode
Nov 28 19:34:20 np0005539279 systemd-udevd[58480]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:34:20 np0005539279 kernel: vlan20: entered promiscuous mode
Nov 28 19:34:20 np0005539279 systemd-udevd[58479]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8160] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58474 uid=0 result="success"
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8202] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8216] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8230] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8243] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8245] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8280] device (eth1): Activation: starting connection 'ci-private-network' (f43141e8-2e9e-5be9-96d8-284c209fde47)
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8288] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8292] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8295] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8298] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8300] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8302] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8322] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8330] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8342] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8351] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8362] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8373] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8381] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8393] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8401] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8410] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8418] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8427] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8436] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8444] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8472] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8505] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8557] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8598] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8615] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8629] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8641] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8653] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8662] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8672] device (eth1): Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8680] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8683] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8686] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8692] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8701] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8712] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8723] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8733] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8744] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8758] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8763] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 19:34:20 np0005539279 NetworkManager[55703]: <info>  [1764376460.8776] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 19:34:22 np0005539279 python3.9[58808]: ansible-ansible.legacy.async_status Invoked with jid=j223799640406.58468 mode=status _async_dir=/root/.ansible_async
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.1362] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58474 uid=0 result="success"
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.3573] checkpoint[0x55812dd93950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.3576] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58474 uid=0 result="success"
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.6085] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58474 uid=0 result="success"
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.6095] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58474 uid=0 result="success"
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.7793] audit: op="networking-control" arg="global-dns-configuration" pid=58474 uid=0 result="success"
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.7820] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.7846] audit: op="networking-control" arg="global-dns-configuration" pid=58474 uid=0 result="success"
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.7865] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58474 uid=0 result="success"
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.9115] checkpoint[0x55812dd93a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 28 19:34:22 np0005539279 NetworkManager[55703]: <info>  [1764376462.9119] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58474 uid=0 result="success"
Nov 28 19:34:22 np0005539279 ansible-async_wrapper.py[58472]: Module complete (58472)
Nov 28 19:34:23 np0005539279 ansible-async_wrapper.py[58471]: Done in kid B.
Nov 28 19:34:25 np0005539279 python3.9[58918]: ansible-ansible.legacy.async_status Invoked with jid=j223799640406.58468 mode=status _async_dir=/root/.ansible_async
Nov 28 19:34:26 np0005539279 python3.9[59017]: ansible-ansible.legacy.async_status Invoked with jid=j223799640406.58468 mode=cleanup _async_dir=/root/.ansible_async
Nov 28 19:34:27 np0005539279 python3.9[59169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:34:27 np0005539279 python3.9[59292]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376466.5124795-322-240141168552388/.source.returncode _original_basename=.gm6liiby follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:28 np0005539279 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 19:34:28 np0005539279 python3.9[59444]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:34:29 np0005539279 python3.9[59570]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376467.9834952-338-148026759179718/.source.cfg _original_basename=.u5uz5cxh follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:30 np0005539279 python3.9[59722]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:34:30 np0005539279 systemd[1]: Reloading Network Manager...
Nov 28 19:34:30 np0005539279 NetworkManager[55703]: <info>  [1764376470.1166] audit: op="reload" arg="0" pid=59726 uid=0 result="success"
Nov 28 19:34:30 np0005539279 NetworkManager[55703]: <info>  [1764376470.1175] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 28 19:34:30 np0005539279 systemd[1]: Reloaded Network Manager.
Nov 28 19:34:30 np0005539279 systemd[1]: session-12.scope: Deactivated successfully.
Nov 28 19:34:30 np0005539279 systemd[1]: session-12.scope: Consumed 54.133s CPU time.
Nov 28 19:34:30 np0005539279 systemd-logind[811]: Session 12 logged out. Waiting for processes to exit.
Nov 28 19:34:30 np0005539279 systemd-logind[811]: Removed session 12.
Nov 28 19:34:36 np0005539279 systemd-logind[811]: New session 13 of user zuul.
Nov 28 19:34:36 np0005539279 systemd[1]: Started Session 13 of User zuul.
Nov 28 19:34:37 np0005539279 python3.9[59913]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:34:38 np0005539279 python3.9[60067]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:34:39 np0005539279 python3.9[60259]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:34:39 np0005539279 systemd-logind[811]: Session 13 logged out. Waiting for processes to exit.
Nov 28 19:34:39 np0005539279 systemd[1]: session-13.scope: Deactivated successfully.
Nov 28 19:34:39 np0005539279 systemd[1]: session-13.scope: Consumed 2.506s CPU time.
Nov 28 19:34:39 np0005539279 systemd-logind[811]: Removed session 13.
Nov 28 19:34:40 np0005539279 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 19:34:44 np0005539279 systemd-logind[811]: New session 14 of user zuul.
Nov 28 19:34:44 np0005539279 systemd[1]: Started Session 14 of User zuul.
Nov 28 19:34:46 np0005539279 python3.9[60441]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:34:47 np0005539279 python3.9[60595]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:34:48 np0005539279 python3.9[60751]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:34:49 np0005539279 python3.9[60838]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:34:51 np0005539279 python3.9[60991]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:34:52 np0005539279 python3.9[61183]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:53 np0005539279 python3.9[61335]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:34:53 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:34:54 np0005539279 python3.9[61497]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:34:54 np0005539279 python3.9[61575]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:34:55 np0005539279 python3.9[61727]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:34:56 np0005539279 python3.9[61805]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:34:57 np0005539279 python3.9[61957]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:34:57 np0005539279 python3.9[62109]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:34:58 np0005539279 python3.9[62261]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:34:58 np0005539279 python3.9[62413]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:34:59 np0005539279 python3.9[62565]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:35:01 np0005539279 python3.9[62720]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:35:02 np0005539279 python3.9[62874]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:35:03 np0005539279 python3.9[63028]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:35:04 np0005539279 python3.9[63180]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:35:05 np0005539279 python3.9[63333]: ansible-service_facts Invoked
Nov 28 19:35:05 np0005539279 network[63350]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 19:35:05 np0005539279 network[63351]: 'network-scripts' will be removed from distribution in near future.
Nov 28 19:35:05 np0005539279 network[63352]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 19:35:12 np0005539279 python3.9[63804]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:35:14 np0005539279 python3.9[63957]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 28 19:35:16 np0005539279 python3.9[64109]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:16 np0005539279 python3.9[64234]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376515.4867-232-241946934973898/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:17 np0005539279 python3.9[64388]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:18 np0005539279 python3.9[64513]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376517.2650578-247-148998120638965/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:19 np0005539279 python3.9[64667]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:20 np0005539279 python3.9[64821]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:35:22 np0005539279 python3.9[64905]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:35:23 np0005539279 python3.9[65059]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:35:24 np0005539279 python3.9[65145]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:35:24 np0005539279 chronyd[781]: chronyd exiting
Nov 28 19:35:24 np0005539279 systemd[1]: Stopping NTP client/server...
Nov 28 19:35:24 np0005539279 systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 19:35:24 np0005539279 systemd[1]: Stopped NTP client/server.
Nov 28 19:35:24 np0005539279 systemd[1]: Starting NTP client/server...
Nov 28 19:35:24 np0005539279 chronyd[65153]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 19:35:24 np0005539279 chronyd[65153]: Frequency -28.416 +/- 0.354 ppm read from /var/lib/chrony/drift
Nov 28 19:35:24 np0005539279 chronyd[65153]: Loaded seccomp filter (level 2)
Nov 28 19:35:24 np0005539279 systemd[1]: Started NTP client/server.
Nov 28 19:35:24 np0005539279 systemd-logind[811]: Session 14 logged out. Waiting for processes to exit.
Nov 28 19:35:24 np0005539279 systemd[1]: session-14.scope: Deactivated successfully.
Nov 28 19:35:24 np0005539279 systemd[1]: session-14.scope: Consumed 27.193s CPU time.
Nov 28 19:35:24 np0005539279 systemd-logind[811]: Removed session 14.
Nov 28 19:35:30 np0005539279 systemd-logind[811]: New session 15 of user zuul.
Nov 28 19:35:30 np0005539279 systemd[1]: Started Session 15 of User zuul.
Nov 28 19:35:31 np0005539279 python3.9[65337]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:35:32 np0005539279 python3.9[65493]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:33 np0005539279 python3.9[65668]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:34 np0005539279 python3.9[65746]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.ww9ngtdn recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:35 np0005539279 python3.9[65898]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:36 np0005539279 python3.9[66021]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376534.821423-61-122792196043104/.source _original_basename=.y14nprr9 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:36 np0005539279 python3.9[66173]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:35:37 np0005539279 python3.9[66325]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:38 np0005539279 python3.9[66450]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376537.0409336-85-240452807614458/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:35:38 np0005539279 python3.9[66602]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:39 np0005539279 python3.9[66725]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376538.3251152-85-10428279587001/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:35:40 np0005539279 python3.9[66877]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:40 np0005539279 python3.9[67029]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:41 np0005539279 python3.9[67152]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376540.2457283-122-249480684553054/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:42 np0005539279 python3.9[67304]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:42 np0005539279 python3.9[67427]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376541.6129923-137-68714274355932/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:43 np0005539279 python3.9[67583]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:35:43 np0005539279 systemd[1]: Reloading.
Nov 28 19:35:43 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:35:43 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:35:44 np0005539279 systemd[1]: Reloading.
Nov 28 19:35:44 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:35:44 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:35:44 np0005539279 systemd[1]: Starting EDPM Container Shutdown...
Nov 28 19:35:44 np0005539279 systemd[1]: Finished EDPM Container Shutdown.
Nov 28 19:35:45 np0005539279 python3.9[67813]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:45 np0005539279 python3.9[67936]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376544.694853-160-185857096363860/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:46 np0005539279 python3.9[68088]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:47 np0005539279 python3.9[68211]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376546.070161-175-63212057540787/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:35:48 np0005539279 python3.9[68363]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:35:48 np0005539279 systemd[1]: Reloading.
Nov 28 19:35:48 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:35:48 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:35:48 np0005539279 systemd[1]: Reloading.
Nov 28 19:35:48 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:35:48 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:35:48 np0005539279 systemd[1]: Starting Create netns directory...
Nov 28 19:35:48 np0005539279 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 19:35:48 np0005539279 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 19:35:48 np0005539279 systemd[1]: Finished Create netns directory.
Nov 28 19:35:49 np0005539279 python3.9[68590]: ansible-ansible.builtin.service_facts Invoked
Nov 28 19:35:49 np0005539279 network[68607]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 19:35:49 np0005539279 network[68608]: 'network-scripts' will be removed from distribution in near future.
Nov 28 19:35:49 np0005539279 network[68609]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 19:35:54 np0005539279 python3.9[68871]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:35:54 np0005539279 systemd[1]: Reloading.
Nov 28 19:35:54 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:35:54 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:35:54 np0005539279 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 28 19:35:54 np0005539279 iptables.init[68911]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 28 19:35:55 np0005539279 iptables.init[68911]: iptables: Flushing firewall rules: [  OK  ]
Nov 28 19:35:55 np0005539279 systemd[1]: iptables.service: Deactivated successfully.
Nov 28 19:35:55 np0005539279 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 28 19:35:55 np0005539279 python3.9[69107]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:35:56 np0005539279 python3.9[69261]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:35:56 np0005539279 systemd[1]: Reloading.
Nov 28 19:35:56 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:35:56 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:35:57 np0005539279 systemd[1]: Starting Netfilter Tables...
Nov 28 19:35:57 np0005539279 systemd[1]: Finished Netfilter Tables.
Nov 28 19:35:58 np0005539279 python3.9[69452]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:35:58 np0005539279 python3.9[69605]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:35:59 np0005539279 python3.9[69730]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376558.4802842-244-141878664913044/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:00 np0005539279 python3.9[69883]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:36:00 np0005539279 systemd[1]: Reloading OpenSSH server daemon...
Nov 28 19:36:00 np0005539279 systemd[1]: Reloaded OpenSSH server daemon.
Nov 28 19:36:01 np0005539279 python3.9[70039]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:02 np0005539279 python3.9[70193]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:36:02 np0005539279 python3.9[70318]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376561.4584017-275-3624792405426/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:03 np0005539279 python3.9[70470]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 19:36:03 np0005539279 systemd[1]: Starting Time & Date Service...
Nov 28 19:36:03 np0005539279 systemd[1]: Started Time & Date Service.
Nov 28 19:36:04 np0005539279 python3.9[70626]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:05 np0005539279 python3.9[70778]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:36:05 np0005539279 python3.9[70901]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376564.7111826-310-519642221660/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:06 np0005539279 python3.9[71053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:36:07 np0005539279 python3.9[71176]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376566.0525477-325-110961570364817/.source.yaml _original_basename=.m2joaqcj follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:07 np0005539279 python3.9[71328]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:36:08 np0005539279 python3.9[71451]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376567.3656359-340-66054673671412/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:09 np0005539279 python3.9[71603]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:36:09 np0005539279 python3.9[71756]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:36:11 np0005539279 python3[71909]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 19:36:12 np0005539279 python3.9[72061]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:36:13 np0005539279 python3.9[72184]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376572.095488-379-45530957763647/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:14 np0005539279 python3.9[72336]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:36:14 np0005539279 python3.9[72459]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376573.492027-394-236949861249350/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:15 np0005539279 python3.9[72611]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:36:16 np0005539279 python3.9[72734]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376574.9311278-409-277394547519353/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:16 np0005539279 python3.9[72886]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:36:17 np0005539279 python3.9[73009]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376576.3352084-424-80381077936218/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:18 np0005539279 python3.9[73163]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:36:18 np0005539279 python3.9[73286]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376577.726506-439-280419879083579/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:19 np0005539279 python3.9[73438]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:20 np0005539279 python3.9[73590]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:36:21 np0005539279 python3.9[73749]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
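The `blockinfile` entry above logs its `block=` parameter with newlines escaped as `#012` (rsyslog's octal escape for `\n`, octal 012 = LF). A minimal Python sketch that decodes the logged value reconstructs the managed block written into `/etc/sysconfig/nftables.conf`:

```python
# rsyslog escapes control characters in logged values; "#012" stands for '\n'.
# The string below is the block= parameter copied from the log entry above.
logged = ('include "/etc/nftables/iptables.nft"#012'
          'include "/etc/nftables/edpm-chains.nft"#012'
          'include "/etc/nftables/edpm-rules.nft"#012'
          'include "/etc/nftables/edpm-jumps.nft"#012')

# Undo the escape to recover the literal file content.
decoded = logged.replace("#012", "\n")
print(decoded)
```

Ansible wraps this content between `# BEGIN ANSIBLE MANAGED BLOCK` and `# END ANSIBLE MANAGED BLOCK` markers (per the `marker=` argument) and validates the result with `nft -c -f %s` before committing it.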
Nov 28 19:36:22 np0005539279 python3.9[73902]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:23 np0005539279 python3.9[74054]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:24 np0005539279 python3.9[74206]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 28 19:36:24 np0005539279 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 19:36:25 np0005539279 python3.9[74360]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
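The two `ansible.posix.mount` calls above use `state=mounted` with `boot=True`, `dump=0`, and `passno=0`, which both mounts the hugetlbfs filesystems immediately and persists them in `/etc/fstab`. Inferred from the logged module arguments, the resulting fstab entries would look roughly like this (a sketch, not a capture of the actual file):

```
none /dev/hugepages1G hugetlbfs pagesize=1G 0 0
none /dev/hugepages2M hugetlbfs pagesize=2M 0 0
```

The `pagesize=` option selects which hugepage pool (1 GiB or 2 MiB) backs each mount point; the directories themselves were created just before with `group=hugetlbfs` and mode `0775`.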
Nov 28 19:36:25 np0005539279 systemd[1]: session-15.scope: Deactivated successfully.
Nov 28 19:36:25 np0005539279 systemd[1]: session-15.scope: Consumed 39.551s CPU time.
Nov 28 19:36:25 np0005539279 systemd-logind[811]: Session 15 logged out. Waiting for processes to exit.
Nov 28 19:36:25 np0005539279 systemd-logind[811]: Removed session 15.
Nov 28 19:36:30 np0005539279 systemd-logind[811]: New session 16 of user zuul.
Nov 28 19:36:30 np0005539279 systemd[1]: Started Session 16 of User zuul.
Nov 28 19:36:31 np0005539279 python3.9[74543]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 28 19:36:32 np0005539279 python3.9[74697]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:36:33 np0005539279 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 19:36:33 np0005539279 python3.9[74849]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:36:34 np0005539279 python3.9[75006]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDhlvvqcS7ufbi1fcfKjqJeUSTCCEvf/q4opaBORSC8OoFtVlo+OWyt6xC4fAVLLehjJZE87yAIVJyqFx6XN99VKjhwQrH60yooDc98AJRpOJFcZNo0ZzATpzfkLoP758v5kVY4cSo1fN93hLKESaj+Hq7/WVeAp4YK0YL1EG+12gh2JuOtjNRDpbPtfWjM5GtppzivESz3jz3QWgy6rolb7M4XdaakLJDYunZalcBIoYPb7cr24x5vXmHzoJiIQaSh4ChyH51wXyIm8rsLbIiAD6kibjfTQmKNezlgdewDaFD+PpKX0VJMUlCPkjC3SulXNNlEzRziqqRTNu5NpA5YLc+gazs6Yy6FqIl4RFFkKgBetbdhVaCxC0JDWRzONhf+dMuaPJ4AHIDQXqCGEr26S2lIMrzuOyGYiYAWB3GiCaTnB+clgXNLvnKzS6105lD6znOfIFK1nxpTcJJ+tLvO+viGA9G3sft6wfmQCVo+jf2WKtj98pcJRtiyqTyw42U=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHHuVDAVqXZJL79tAi9gQj8DW9KRFFN+3NK2hPNdBflH#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOJ4/59ynYnBdI1yrpmK/GDkR+yyafKMK/Bp5hiOmHGk9QFK0KYRPBiqQjhkHdijDqF29PDxp1yxftJBk2aMPUA=#012 create=True mode=0644 path=/tmp/ansible.pmejxiad state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:35 np0005539279 python3.9[75158]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.pmejxiad' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:36:36 np0005539279 python3.9[75312]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.pmejxiad state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
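The three tasks above (`tempfile` → `blockinfile` → `command` → `file state=absent`) implement a build-then-replace pattern: assemble the cluster host keys in a temporary file, overwrite `/etc/ssh/ssh_known_hosts` with a single `cat` redirect, then remove the temp file. A rough shell sketch of the same pattern, using a local output path and a placeholder key so it can run unprivileged (the real play writes the full keys to `/etc/ssh/ssh_known_hosts` as root):

```shell
# Build-then-replace sketch; paths and the key value are illustrative placeholders.
tmp=$(mktemp /tmp/ansible.XXXXXX)

# Stage the managed block in the temp file first (blockinfile step).
{
  echo "# BEGIN ANSIBLE MANAGED BLOCK"
  echo "compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 PLACEHOLDER_KEY"
  echo "# END ANSIBLE MANAGED BLOCK"
} > "$tmp"

# Replace the target in one shot (the logged command step), then clean up.
cat "$tmp" > ./ssh_known_hosts   # stand-in for /etc/ssh/ssh_known_hosts
rm -f "$tmp"
```

Staging in a temp file keeps a partially written `ssh_known_hosts` from ever being visible to concurrent SSH clients for longer than the final copy takes.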
Nov 28 19:36:36 np0005539279 systemd[1]: session-16.scope: Deactivated successfully.
Nov 28 19:36:36 np0005539279 systemd[1]: session-16.scope: Consumed 3.876s CPU time.
Nov 28 19:36:36 np0005539279 systemd-logind[811]: Session 16 logged out. Waiting for processes to exit.
Nov 28 19:36:36 np0005539279 systemd-logind[811]: Removed session 16.
Nov 28 19:36:42 np0005539279 systemd-logind[811]: New session 17 of user zuul.
Nov 28 19:36:42 np0005539279 systemd[1]: Started Session 17 of User zuul.
Nov 28 19:36:43 np0005539279 python3.9[75492]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:36:44 np0005539279 python3.9[75648]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 19:36:45 np0005539279 python3.9[75802]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:36:46 np0005539279 python3.9[75955]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:36:47 np0005539279 python3.9[76108]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:36:48 np0005539279 python3.9[76262]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:36:49 np0005539279 python3.9[76417]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:36:49 np0005539279 systemd[1]: session-17.scope: Deactivated successfully.
Nov 28 19:36:49 np0005539279 systemd[1]: session-17.scope: Consumed 4.905s CPU time.
Nov 28 19:36:49 np0005539279 systemd-logind[811]: Session 17 logged out. Waiting for processes to exit.
Nov 28 19:36:49 np0005539279 systemd-logind[811]: Removed session 17.
Nov 28 19:36:55 np0005539279 systemd-logind[811]: New session 18 of user zuul.
Nov 28 19:36:55 np0005539279 systemd[1]: Started Session 18 of User zuul.
Nov 28 19:36:56 np0005539279 python3.9[76599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:36:57 np0005539279 python3.9[76757]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:36:58 np0005539279 python3.9[76841]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 19:37:00 np0005539279 python3.9[76992]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:37:01 np0005539279 python3.9[77143]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 19:37:02 np0005539279 python3.9[77293]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:37:03 np0005539279 python3.9[77443]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:37:04 np0005539279 systemd[1]: session-18.scope: Deactivated successfully.
Nov 28 19:37:04 np0005539279 systemd[1]: session-18.scope: Consumed 6.561s CPU time.
Nov 28 19:37:04 np0005539279 systemd-logind[811]: Session 18 logged out. Waiting for processes to exit.
Nov 28 19:37:04 np0005539279 systemd-logind[811]: Removed session 18.
Nov 28 19:37:11 np0005539279 systemd-logind[811]: New session 19 of user zuul.
Nov 28 19:37:11 np0005539279 systemd[1]: Started Session 19 of User zuul.
Nov 28 19:37:12 np0005539279 python3.9[77625]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:37:14 np0005539279 python3.9[77781]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:15 np0005539279 python3.9[77933]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:16 np0005539279 python3.9[78087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:17 np0005539279 python3.9[78210]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376635.4799564-65-20625761926706/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=8b126925cf13e60cd58f2305019fc8709ca5a17b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:17 np0005539279 python3.9[78362]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:18 np0005539279 python3.9[78485]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376637.2621214-65-78913374858437/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a7ce68b4f4fcb00f12ab8525118232026d538d1e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:19 np0005539279 python3.9[78637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:19 np0005539279 python3.9[78760]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376638.5827653-65-253888793857220/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=b60d7b893c723893cb188ad8e4f0c6d9055b530a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:20 np0005539279 python3.9[78912]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:21 np0005539279 python3.9[79064]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:22 np0005539279 python3.9[79216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:22 np0005539279 python3.9[79339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376641.4926515-124-156786552979623/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=786f56fa594848d8ab99c573e36a139dec22707a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:23 np0005539279 python3.9[79491]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:23 np0005539279 python3.9[79614]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376642.8242514-124-83808752886925/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=468f425cf3d857862b7cac5c11794666f2c18d14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:24 np0005539279 python3.9[79766]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:25 np0005539279 python3.9[79889]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376644.100099-124-137142109747451/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=411c9e4b0e4b29741439c168b8ad05988e373613 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:26 np0005539279 python3.9[80041]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:26 np0005539279 python3.9[80193]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:27 np0005539279 python3.9[80345]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:28 np0005539279 python3.9[80468]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376647.1123116-183-68409119824783/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=4c98422f21b1c9d0c33894cd478679e667f20611 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:28 np0005539279 python3.9[80620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:29 np0005539279 python3.9[80743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376648.3821952-183-239922323292779/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a3dc55e7a4e8d2a6ef7eb5086bb6f4b4a44dca38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:30 np0005539279 python3.9[80895]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:30 np0005539279 python3.9[81018]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376649.7299325-183-38370057048567/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=dcb6e7a9a7500e1af2fc78241b37a29039eea74e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:31 np0005539279 python3.9[81170]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:32 np0005539279 python3.9[81322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:33 np0005539279 python3.9[81474]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:33 np0005539279 python3.9[81597]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376652.5935667-242-104010769718170/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e1289338933962a549f1e81466d61b3ee2c1105b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:33 np0005539279 chronyd[65153]: Selected source 216.197.228.230 (pool.ntp.org)
Nov 28 19:37:34 np0005539279 python3.9[81749]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:35 np0005539279 python3.9[81872]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376653.912682-242-151119446635713/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=a3dc55e7a4e8d2a6ef7eb5086bb6f4b4a44dca38 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:35 np0005539279 python3.9[82024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:36 np0005539279 python3.9[82147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376655.207774-242-50539769746823/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=b1b93d38d8debdaba1c7cadfe0d1d86d98d3e4af backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:37 np0005539279 python3.9[82301]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:38 np0005539279 python3.9[82453]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:39 np0005539279 python3.9[82578]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376657.8290558-310-46956183981615/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=d91c19d3d5949c7bafe97146b80f0a711185a1cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:39 np0005539279 python3.9[82730]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:40 np0005539279 python3.9[82882]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:41 np0005539279 python3.9[83005]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376660.1844468-334-272542751609769/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=d91c19d3d5949c7bafe97146b80f0a711185a1cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:42 np0005539279 python3.9[83157]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:42 np0005539279 python3.9[83309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:43 np0005539279 python3.9[83432]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376662.3585072-358-212254370165966/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=d91c19d3d5949c7bafe97146b80f0a711185a1cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:44 np0005539279 python3.9[83584]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:45 np0005539279 python3.9[83736]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:45 np0005539279 python3.9[83859]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376664.6380444-382-254331264321680/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=d91c19d3d5949c7bafe97146b80f0a711185a1cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:46 np0005539279 python3.9[84011]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:47 np0005539279 python3.9[84163]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:47 np0005539279 python3.9[84286]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376666.7183173-406-215941193272616/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=d91c19d3d5949c7bafe97146b80f0a711185a1cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:48 np0005539279 python3.9[84438]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:49 np0005539279 python3.9[84590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:50 np0005539279 python3.9[84713]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376669.0089943-430-19969976099155/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=d91c19d3d5949c7bafe97146b80f0a711185a1cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:51 np0005539279 python3.9[84865]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:37:51 np0005539279 python3.9[85019]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:37:52 np0005539279 python3.9[85142]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376671.2235324-454-277024567324267/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=d91c19d3d5949c7bafe97146b80f0a711185a1cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:37:52 np0005539279 systemd[1]: session-19.scope: Deactivated successfully.
Nov 28 19:37:52 np0005539279 systemd[1]: session-19.scope: Consumed 32.682s CPU time.
Nov 28 19:37:52 np0005539279 systemd-logind[811]: Session 19 logged out. Waiting for processes to exit.
Nov 28 19:37:52 np0005539279 systemd-logind[811]: Removed session 19.
Nov 28 19:37:58 np0005539279 systemd-logind[811]: New session 20 of user zuul.
Nov 28 19:37:58 np0005539279 systemd[1]: Started Session 20 of User zuul.
Nov 28 19:37:59 np0005539279 python3.9[85321]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:38:00 np0005539279 python3.9[85477]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:38:01 np0005539279 python3.9[85629]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:38:02 np0005539279 python3.9[85779]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:38:03 np0005539279 python3.9[85931]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 19:38:05 np0005539279 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 28 19:38:05 np0005539279 python3.9[86089]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:38:06 np0005539279 python3.9[86173]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:38:08 np0005539279 python3.9[86326]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 19:38:09 np0005539279 python3[86481]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
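The `edpm_nftables_snippet` entry above logs its multi-line YAML payload on a single line, with each newline escaped as `#012` (octal 012 = `\n`, the rsyslog/journald control-character escaping). A minimal sketch for recovering the readable snippet; the sample string is an abbreviated excerpt of the logged content, not the full entry:

```python
# Decode rsyslog-style "#NNN" octal control-character escapes (e.g. "#012"
# for newline) back into multi-line text, as seen in the ovn.yaml snippet
# logged above.
import re

def decode_octal_escapes(logged: str) -> str:
    # "#NNN" where NNN is a three-digit octal character code
    return re.sub(r"#([0-7]{3})", lambda m: chr(int(m.group(1), 8)), logged)

excerpt = ("- rule_name: 118 neutron vxlan networks#012  rule:#012"
           "    proto: udp#012    dport: 4789")
print(decode_octal_escapes(excerpt))
```

Running this on the full logged `content=` value reproduces the YAML written to `/var/lib/edpm-config/firewall/ovn.yaml`.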
Nov 28 19:38:10 np0005539279 python3.9[86638]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:11 np0005539279 python3.9[86790]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:12 np0005539279 python3.9[86868]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:13 np0005539279 python3.9[87020]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:13 np0005539279 python3.9[87100]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mypexrrc recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:14 np0005539279 python3.9[87252]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:15 np0005539279 python3.9[87330]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:16 np0005539279 python3.9[87482]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:38:17 np0005539279 python3[87635]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 19:38:17 np0005539279 python3.9[87787]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:18 np0005539279 python3.9[87912]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376697.2660685-157-210912760814855/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:19 np0005539279 python3.9[88064]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:20 np0005539279 python3.9[88189]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376698.8564577-172-164997186846885/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:20 np0005539279 python3.9[88341]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:21 np0005539279 python3.9[88467]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376700.3481443-187-242270210413682/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:22 np0005539279 python3.9[88619]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:23 np0005539279 python3.9[88744]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376701.761024-202-184048753773627/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:23 np0005539279 python3.9[88896]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:24 np0005539279 python3.9[89021]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764376703.2068946-217-54838768911465/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:25 np0005539279 python3.9[89173]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:25 np0005539279 python3.9[89325]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:38:26 np0005539279 python3.9[89480]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
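Decoding the `#012` escapes in the `blockinfile` entry above (with `marker=# {mark} ANSIBLE MANAGED BLOCK`, `marker_begin=BEGIN`, `marker_end=END`), the block appended to `/etc/sysconfig/nftables.conf` should look roughly like this:

```
# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK
```

The `validate=nft -c -f %s` parameter means the whole file is syntax-checked by nftables before the block is committed.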
Nov 28 19:38:27 np0005539279 python3.9[89632]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:38:28 np0005539279 python3.9[89785]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:38:29 np0005539279 python3.9[89941]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:38:29 np0005539279 python3.9[90096]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:31 np0005539279 python3.9[90246]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:38:32 np0005539279 python3.9[90399]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:38:32 np0005539279 ovs-vsctl[90400]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 28 19:38:33 np0005539279 python3.9[90552]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:38:34 np0005539279 python3.9[90707]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:38:34 np0005539279 ovs-vsctl[90708]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 28 19:38:34 np0005539279 python3.9[90858]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:38:35 np0005539279 python3.9[91012]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:38:36 np0005539279 python3.9[91164]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:37 np0005539279 python3.9[91242]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:38:37 np0005539279 python3.9[91394]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:38 np0005539279 python3.9[91472]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:38:39 np0005539279 python3.9[91624]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:39 np0005539279 python3.9[91776]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:40 np0005539279 python3.9[91854]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:41 np0005539279 python3.9[92006]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:41 np0005539279 python3.9[92084]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:42 np0005539279 python3.9[92236]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:38:42 np0005539279 systemd[1]: Reloading.
Nov 28 19:38:42 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:38:42 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:38:43 np0005539279 python3.9[92427]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:44 np0005539279 python3.9[92507]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:45 np0005539279 python3.9[92659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:45 np0005539279 python3.9[92737]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:46 np0005539279 python3.9[92889]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:38:46 np0005539279 systemd[1]: Reloading.
Nov 28 19:38:46 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:38:46 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:38:46 np0005539279 systemd[1]: Starting Create netns directory...
Nov 28 19:38:46 np0005539279 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 19:38:46 np0005539279 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 19:38:46 np0005539279 systemd[1]: Finished Create netns directory.
Nov 28 19:38:47 np0005539279 python3.9[93083]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:38:48 np0005539279 python3.9[93235]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:49 np0005539279 python3.9[93358]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376728.0436141-468-69621567397831/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:38:50 np0005539279 python3.9[93510]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:38:51 np0005539279 python3.9[93662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:38:51 np0005539279 python3.9[93785]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376730.5803926-493-117130793446409/.source.json _original_basename=.ihre4j8p follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:52 np0005539279 python3.9[93937]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:38:55 np0005539279 python3.9[94368]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 28 19:38:56 np0005539279 python3.9[94520]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 19:38:57 np0005539279 python3.9[94672]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 19:38:57 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:38:58 np0005539279 python3[94835]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 19:38:58 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:38:58 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:38:58 np0005539279 podman[94872]: 2025-11-29 00:38:58.970515375 +0000 UTC m=+0.052537618 container create 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 19:38:58 np0005539279 podman[94872]: 2025-11-29 00:38:58.946259798 +0000 UTC m=+0.028282061 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 19:38:58 np0005539279 python3[94835]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 19:38:59 np0005539279 python3.9[95060]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:38:59 np0005539279 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 19:39:00 np0005539279 python3.9[95214]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:39:01 np0005539279 python3.9[95290]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:39:02 np0005539279 python3.9[95441]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764376741.1825476-581-21477473375279/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:39:02 np0005539279 python3.9[95517]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:39:02 np0005539279 systemd[1]: Reloading.
Nov 28 19:39:02 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:39:02 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:39:03 np0005539279 python3.9[95629]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:39:03 np0005539279 systemd[1]: Reloading.
Nov 28 19:39:03 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:39:03 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:39:03 np0005539279 systemd[1]: Starting ovn_controller container...
Nov 28 19:39:03 np0005539279 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 28 19:39:03 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:39:04 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/739519c66440a2d8dcacae21977003f9edca9ec367f1f1069af330f06bd0118e/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 19:39:04 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f.
Nov 28 19:39:04 np0005539279 podman[95671]: 2025-11-29 00:39:04.055842468 +0000 UTC m=+0.170959562 container init 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: + sudo -E kolla_set_configs
Nov 28 19:39:04 np0005539279 podman[95671]: 2025-11-29 00:39:04.092130073 +0000 UTC m=+0.207247107 container start 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 19:39:04 np0005539279 edpm-start-podman-container[95671]: ovn_controller
Nov 28 19:39:04 np0005539279 systemd[1]: Created slice User Slice of UID 0.
Nov 28 19:39:04 np0005539279 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 19:39:04 np0005539279 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 19:39:04 np0005539279 systemd[1]: Starting User Manager for UID 0...
Nov 28 19:39:04 np0005539279 edpm-start-podman-container[95670]: Creating additional drop-in dependency for "ovn_controller" (0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f)
Nov 28 19:39:04 np0005539279 systemd[1]: Reloading.
Nov 28 19:39:04 np0005539279 podman[95693]: 2025-11-29 00:39:04.203947632 +0000 UTC m=+0.093824924 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 28 19:39:04 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:39:04 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:39:04 np0005539279 systemd[95718]: Queued start job for default target Main User Target.
Nov 28 19:39:04 np0005539279 systemd[95718]: Created slice User Application Slice.
Nov 28 19:39:04 np0005539279 systemd[95718]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 19:39:04 np0005539279 systemd[95718]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 19:39:04 np0005539279 systemd[95718]: Reached target Paths.
Nov 28 19:39:04 np0005539279 systemd[95718]: Reached target Timers.
Nov 28 19:39:04 np0005539279 systemd[95718]: Starting D-Bus User Message Bus Socket...
Nov 28 19:39:04 np0005539279 systemd[95718]: Starting Create User's Volatile Files and Directories...
Nov 28 19:39:04 np0005539279 systemd[95718]: Finished Create User's Volatile Files and Directories.
Nov 28 19:39:04 np0005539279 systemd[95718]: Listening on D-Bus User Message Bus Socket.
Nov 28 19:39:04 np0005539279 systemd[95718]: Reached target Sockets.
Nov 28 19:39:04 np0005539279 systemd[95718]: Reached target Basic System.
Nov 28 19:39:04 np0005539279 systemd[95718]: Reached target Main User Target.
Nov 28 19:39:04 np0005539279 systemd[95718]: Startup finished in 153ms.
Nov 28 19:39:04 np0005539279 systemd[1]: Started User Manager for UID 0.
Nov 28 19:39:04 np0005539279 systemd[1]: Started ovn_controller container.
Nov 28 19:39:04 np0005539279 systemd[1]: 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f-109cb0ad5f10f60d.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 19:39:04 np0005539279 systemd[1]: 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f-109cb0ad5f10f60d.service: Failed with result 'exit-code'.
Nov 28 19:39:04 np0005539279 systemd[1]: Started Session c1 of User root.
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: INFO:__main__:Validating config file
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: INFO:__main__:Writing out command to execute
Nov 28 19:39:04 np0005539279 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: ++ cat /run_command
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: + ARGS=
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: + sudo kolla_copy_cacerts
Nov 28 19:39:04 np0005539279 systemd[1]: Started Session c2 of User root.
Nov 28 19:39:04 np0005539279 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: + [[ ! -n '' ]]
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: + . kolla_extend_start
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: + umask 0022
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 28 19:39:04 np0005539279 NetworkManager[55703]: <info>  [1764376744.6006] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 28 19:39:04 np0005539279 NetworkManager[55703]: <info>  [1764376744.6017] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:39:04 np0005539279 NetworkManager[55703]: <info>  [1764376744.6033] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 28 19:39:04 np0005539279 NetworkManager[55703]: <info>  [1764376744.6041] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 28 19:39:04 np0005539279 NetworkManager[55703]: <info>  [1764376744.6047] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 19:39:04 np0005539279 kernel: br-int: entered promiscuous mode
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 19:39:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:04Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 19:39:04 np0005539279 NetworkManager[55703]: <info>  [1764376744.6277] manager: (ovn-20c0de-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 28 19:39:04 np0005539279 kernel: genev_sys_6081: entered promiscuous mode
Nov 28 19:39:04 np0005539279 systemd-udevd[95843]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:39:04 np0005539279 systemd-udevd[95842]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:39:04 np0005539279 NetworkManager[55703]: <info>  [1764376744.6466] device (genev_sys_6081): carrier: link connected
Nov 28 19:39:04 np0005539279 NetworkManager[55703]: <info>  [1764376744.6470] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 28 19:39:05 np0005539279 python3.9[95951]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:39:05 np0005539279 ovs-vsctl[95952]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 28 19:39:05 np0005539279 python3.9[96104]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:39:06 np0005539279 ovs-vsctl[96106]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 28 19:39:07 np0005539279 python3.9[96259]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:39:07 np0005539279 ovs-vsctl[96260]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 28 19:39:07 np0005539279 systemd[1]: session-20.scope: Deactivated successfully.
Nov 28 19:39:07 np0005539279 systemd[1]: session-20.scope: Consumed 52.136s CPU time.
Nov 28 19:39:07 np0005539279 systemd-logind[811]: Session 20 logged out. Waiting for processes to exit.
Nov 28 19:39:07 np0005539279 systemd-logind[811]: Removed session 20.
Nov 28 19:39:13 np0005539279 systemd-logind[811]: New session 22 of user zuul.
Nov 28 19:39:13 np0005539279 systemd[1]: Started Session 22 of User zuul.
Nov 28 19:39:14 np0005539279 python3.9[96440]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:39:14 np0005539279 systemd[1]: Stopping User Manager for UID 0...
Nov 28 19:39:14 np0005539279 systemd[95718]: Activating special unit Exit the Session...
Nov 28 19:39:14 np0005539279 systemd[95718]: Stopped target Main User Target.
Nov 28 19:39:14 np0005539279 systemd[95718]: Stopped target Basic System.
Nov 28 19:39:14 np0005539279 systemd[95718]: Stopped target Paths.
Nov 28 19:39:14 np0005539279 systemd[95718]: Stopped target Sockets.
Nov 28 19:39:14 np0005539279 systemd[95718]: Stopped target Timers.
Nov 28 19:39:14 np0005539279 systemd[95718]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 19:39:14 np0005539279 systemd[95718]: Closed D-Bus User Message Bus Socket.
Nov 28 19:39:14 np0005539279 systemd[95718]: Stopped Create User's Volatile Files and Directories.
Nov 28 19:39:14 np0005539279 systemd[95718]: Removed slice User Application Slice.
Nov 28 19:39:14 np0005539279 systemd[95718]: Reached target Shutdown.
Nov 28 19:39:14 np0005539279 systemd[95718]: Finished Exit the Session.
Nov 28 19:39:14 np0005539279 systemd[95718]: Reached target Exit the Session.
Nov 28 19:39:14 np0005539279 systemd[1]: user@0.service: Deactivated successfully.
Nov 28 19:39:14 np0005539279 systemd[1]: Stopped User Manager for UID 0.
Nov 28 19:39:14 np0005539279 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 19:39:14 np0005539279 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 19:39:14 np0005539279 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 19:39:14 np0005539279 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 19:39:14 np0005539279 systemd[1]: Removed slice User Slice of UID 0.
Nov 28 19:39:15 np0005539279 python3.9[96600]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:16 np0005539279 python3.9[96756]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:17 np0005539279 python3.9[96908]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:17 np0005539279 python3.9[97060]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:18 np0005539279 python3.9[97212]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:19 np0005539279 python3.9[97362]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:39:20 np0005539279 python3.9[97514]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 19:39:21 np0005539279 python3.9[97664]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:22 np0005539279 python3.9[97785]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376761.0959325-86-151885746495133/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:23 np0005539279 python3.9[97936]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:24 np0005539279 python3.9[98057]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376762.7478924-101-97566664851904/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:25 np0005539279 python3.9[98209]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:39:26 np0005539279 python3.9[98293]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:39:28 np0005539279 python3.9[98446]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 19:39:30 np0005539279 python3.9[98603]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:31 np0005539279 python3.9[98724]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376770.1591613-138-241301201792902/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:32 np0005539279 python3.9[98874]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:32 np0005539279 python3.9[98995]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376771.5830436-138-57687938746231/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:34 np0005539279 python3.9[99145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:34 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:34Z|00025|memory|INFO|16128 kB peak resident set size after 30.0 seconds
Nov 28 19:39:34 np0005539279 ovn_controller[95686]: 2025-11-29T00:39:34Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Nov 28 19:39:34 np0005539279 podman[99240]: 2025-11-29 00:39:34.563106537 +0000 UTC m=+0.091314004 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 19:39:34 np0005539279 python3.9[99285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376773.5521402-182-275407323274314/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:35 np0005539279 python3.9[99445]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:35 np0005539279 python3.9[99566]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376774.8915663-182-131669849759571/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:36 np0005539279 python3.9[99716]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:39:37 np0005539279 python3.9[99870]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:38 np0005539279 python3.9[100022]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:38 np0005539279 python3.9[100100]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:39 np0005539279 python3.9[100252]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:39 np0005539279 python3.9[100330]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:40 np0005539279 python3.9[100482]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:39:41 np0005539279 python3.9[100634]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:42 np0005539279 python3.9[100714]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:39:42 np0005539279 python3.9[100866]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:43 np0005539279 python3.9[100944]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:39:44 np0005539279 python3.9[101096]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:39:44 np0005539279 systemd[1]: Reloading.
Nov 28 19:39:44 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:39:44 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:39:46 np0005539279 python3.9[101285]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:47 np0005539279 python3.9[101363]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:39:47 np0005539279 python3.9[101515]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:48 np0005539279 python3.9[101593]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:39:49 np0005539279 python3.9[101747]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:39:49 np0005539279 systemd[1]: Reloading.
Nov 28 19:39:49 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:39:49 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:39:49 np0005539279 systemd[1]: Starting Create netns directory...
Nov 28 19:39:49 np0005539279 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 19:39:49 np0005539279 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 19:39:49 np0005539279 systemd[1]: Finished Create netns directory.
Nov 28 19:39:50 np0005539279 python3.9[101940]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:51 np0005539279 python3.9[102092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:51 np0005539279 python3.9[102215]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764376790.5723279-333-187125682665124/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:52 np0005539279 python3.9[102367]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:39:53 np0005539279 python3.9[102519]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:39:54 np0005539279 python3.9[102642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764376793.0128973-358-216370382103339/.source.json _original_basename=.0duifwne follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:39:55 np0005539279 python3.9[102794]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:39:57 np0005539279 python3.9[103221]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 28 19:39:58 np0005539279 python3.9[103373]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 19:39:59 np0005539279 python3.9[103525]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 19:40:00 np0005539279 python3[103701]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 19:40:01 np0005539279 podman[103739]: 2025-11-29 00:40:01.198872036 +0000 UTC m=+0.060534646 container create dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:40:01 np0005539279 podman[103739]: 2025-11-29 00:40:01.175982401 +0000 UTC m=+0.037645041 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 19:40:01 np0005539279 python3[103701]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 19:40:02 np0005539279 python3.9[103929]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:40:02 np0005539279 python3.9[104083]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:03 np0005539279 python3.9[104159]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:40:04 np0005539279 python3.9[104310]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764376803.4131694-446-83367248795099/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:04 np0005539279 python3.9[104386]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:40:04 np0005539279 systemd[1]: Reloading.
Nov 28 19:40:04 np0005539279 podman[104387]: 2025-11-29 00:40:04.899926067 +0000 UTC m=+0.138128017 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 19:40:04 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:40:04 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:40:05 np0005539279 python3.9[104523]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:40:05 np0005539279 systemd[1]: Reloading.
Nov 28 19:40:05 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:40:05 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:40:06 np0005539279 systemd[1]: Starting ovn_metadata_agent container...
Nov 28 19:40:06 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:40:06 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f83fc8895a355dc7b3c7fcbe7588530f5f5c4e3334329c6ceea8d34e56fbea4/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 19:40:06 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f83fc8895a355dc7b3c7fcbe7588530f5f5c4e3334329c6ceea8d34e56fbea4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 19:40:06 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da.
Nov 28 19:40:06 np0005539279 podman[104564]: 2025-11-29 00:40:06.23952189 +0000 UTC m=+0.168387775 container init dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: + sudo -E kolla_set_configs
Nov 28 19:40:06 np0005539279 podman[104564]: 2025-11-29 00:40:06.2709764 +0000 UTC m=+0.199842255 container start dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 19:40:06 np0005539279 edpm-start-podman-container[104564]: ovn_metadata_agent
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Validating config file
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Copying service configuration files
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Writing out command to execute
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: ++ cat /run_command
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: + CMD=neutron-ovn-metadata-agent
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: + ARGS=
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: + sudo kolla_copy_cacerts
Nov 28 19:40:06 np0005539279 podman[104586]: 2025-11-29 00:40:06.348245422 +0000 UTC m=+0.062943592 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Nov 28 19:40:06 np0005539279 edpm-start-podman-container[104563]: Creating additional drop-in dependency for "ovn_metadata_agent" (dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da)
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: + [[ ! -n '' ]]
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: + . kolla_extend_start
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: + umask 0022
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: + exec neutron-ovn-metadata-agent
Nov 28 19:40:06 np0005539279 ovn_metadata_agent[104579]: Running command: 'neutron-ovn-metadata-agent'
Nov 28 19:40:06 np0005539279 systemd[1]: Reloading.
Nov 28 19:40:06 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:40:06 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:40:06 np0005539279 systemd[1]: Started ovn_metadata_agent container.
Nov 28 19:40:07 np0005539279 systemd[1]: session-22.scope: Deactivated successfully.
Nov 28 19:40:07 np0005539279 systemd[1]: session-22.scope: Consumed 39.369s CPU time.
Nov 28 19:40:07 np0005539279 systemd-logind[811]: Session 22 logged out. Waiting for processes to exit.
Nov 28 19:40:07 np0005539279 systemd-logind[811]: Removed session 22.
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.027 104584 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.027 104584 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.027 104584 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.028 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.028 104584 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.028 104584 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.028 104584 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.028 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.029 104584 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.029 104584 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.029 104584 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.029 104584 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.029 104584 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.029 104584 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.029 104584 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.030 104584 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.030 104584 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.030 104584 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.030 104584 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.030 104584 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.030 104584 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.030 104584 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.031 104584 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.031 104584 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.031 104584 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.031 104584 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.031 104584 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.031 104584 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.031 104584 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.031 104584 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.032 104584 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.032 104584 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.032 104584 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.032 104584 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.032 104584 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.032 104584 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.032 104584 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.032 104584 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.033 104584 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.033 104584 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.033 104584 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.033 104584 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.033 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.033 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.033 104584 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.034 104584 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.034 104584 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.034 104584 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.034 104584 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.034 104584 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.034 104584 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.034 104584 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.034 104584 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.034 104584 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.035 104584 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.035 104584 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.035 104584 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.035 104584 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.035 104584 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.035 104584 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.035 104584 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.035 104584 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.036 104584 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.036 104584 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.036 104584 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.036 104584 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.036 104584 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.036 104584 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.036 104584 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.037 104584 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.037 104584 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.037 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.037 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.037 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.037 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.037 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.038 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.038 104584 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.038 104584 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.038 104584 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.038 104584 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.038 104584 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.038 104584 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.038 104584 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.039 104584 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.039 104584 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.039 104584 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.039 104584 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.039 104584 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.039 104584 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.039 104584 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.040 104584 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.040 104584 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.040 104584 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.040 104584 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.040 104584 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.040 104584 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.040 104584 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.041 104584 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.041 104584 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.041 104584 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.041 104584 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.041 104584 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.041 104584 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.041 104584 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.041 104584 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.042 104584 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.042 104584 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.042 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.042 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.042 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.042 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.043 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.043 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.043 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.043 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.043 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.043 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.043 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.044 104584 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.044 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.044 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.044 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.044 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.044 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.044 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.045 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.045 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.045 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.045 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.045 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.045 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.045 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.046 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.046 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.046 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.046 104584 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.046 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.046 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.046 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.047 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.047 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.047 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.047 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.047 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.047 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.047 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.048 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.048 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.048 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.048 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.048 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.048 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.048 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.049 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.050 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.050 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.050 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.050 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.050 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.050 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.050 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.050 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.050 104584 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.051 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.052 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.053 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.053 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.053 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.053 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.053 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.053 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.053 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.053 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.053 104584 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.054 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.054 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.054 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.054 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.054 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.054 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.054 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.054 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.054 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.055 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.056 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.057 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.058 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.058 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.058 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.058 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.058 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.058 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.058 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.058 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.058 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.059 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.059 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.059 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.059 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.059 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.059 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.059 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.059 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.059 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.060 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.060 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.060 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.060 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.060 104584 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.060 104584 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.060 104584 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.060 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.060 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.061 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.061 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.061 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.061 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.061 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.061 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.061 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.061 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.061 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.062 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.062 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.062 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.063 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.063 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.063 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.063 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.063 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.064 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.064 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.064 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.064 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.064 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.064 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.065 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.065 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.065 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.065 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.065 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.065 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.065 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.066 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.066 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.066 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.066 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.066 104584 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.066 104584 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.077 104584 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.077 104584 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.077 104584 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.077 104584 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.078 104584 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.091 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name bb6a090d-c99b-4a6a-9b20-ad4330625b75 (UUID: bb6a090d-c99b-4a6a-9b20-ad4330625b75) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.117 104584 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.117 104584 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.117 104584 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.117 104584 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.120 104584 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.126 104584 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.132 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'bb6a090d-c99b-4a6a-9b20-ad4330625b75'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], external_ids={}, name=bb6a090d-c99b-4a6a-9b20-ad4330625b75, nb_cfg_timestamp=1764376752623, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.133 104584 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f6c87ca8160>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.134 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.134 104584 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.134 104584 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.134 104584 INFO oslo_service.service [-] Starting 1 workers#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.138 104584 DEBUG oslo_service.service [-] Started child 104693 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.141 104584 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp_bdast8e/privsep.sock']#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.144 104693 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-231742'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.175 104693 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.175 104693 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.176 104693 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.180 104693 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.189 104693 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.198 104693 INFO eventlet.wsgi.server [-] (104693) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Nov 28 19:40:08 np0005539279 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.871 104584 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.872 104584 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_bdast8e/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.704 104698 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.710 104698 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.714 104698 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.715 104698 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104698#033[00m
Nov 28 19:40:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:08.876 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[124d45f0-e151-4e2e-b118-018947c039d7]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.332 104698 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.332 104698 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.332 104698 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.812 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e42fc1-cbe6-477c-9ca0-8420f9d5a68f]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.816 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, column=external_ids, values=({'neutron:ovn-metadata-id': 'e8504261-912e-5bb4-ba81-3b502ef5a8d1'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.831 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.839 104584 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.840 104584 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.840 104584 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.840 104584 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.840 104584 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.840 104584 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.841 104584 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.841 104584 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.841 104584 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.842 104584 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.842 104584 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.842 104584 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.842 104584 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.843 104584 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.843 104584 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.843 104584 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.843 104584 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.844 104584 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.844 104584 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.844 104584 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.844 104584 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.845 104584 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.845 104584 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.845 104584 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.845 104584 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.846 104584 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.846 104584 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.846 104584 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.847 104584 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.847 104584 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.847 104584 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.847 104584 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.848 104584 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.848 104584 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.848 104584 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.848 104584 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.849 104584 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.849 104584 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.849 104584 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.850 104584 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.850 104584 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.850 104584 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.850 104584 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.851 104584 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.851 104584 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.851 104584 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.851 104584 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.852 104584 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.852 104584 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.852 104584 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.852 104584 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.852 104584 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.853 104584 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.853 104584 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.853 104584 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.853 104584 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.853 104584 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.854 104584 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.854 104584 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.854 104584 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.854 104584 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.855 104584 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.855 104584 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.855 104584 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.855 104584 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.856 104584 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.856 104584 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.856 104584 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.856 104584 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.857 104584 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.857 104584 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.857 104584 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.857 104584 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.857 104584 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.858 104584 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.858 104584 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.858 104584 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.858 104584 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.859 104584 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.859 104584 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.859 104584 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.859 104584 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.860 104584 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.860 104584 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.860 104584 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.860 104584 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.861 104584 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.861 104584 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.861 104584 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.861 104584 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.862 104584 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.862 104584 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.862 104584 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.862 104584 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.862 104584 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.863 104584 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.863 104584 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.863 104584 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.863 104584 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.863 104584 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.864 104584 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.864 104584 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.864 104584 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.864 104584 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.864 104584 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.865 104584 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.865 104584 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.865 104584 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.865 104584 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.866 104584 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.866 104584 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.866 104584 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.867 104584 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.867 104584 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.867 104584 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.867 104584 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.867 104584 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.868 104584 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.868 104584 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.868 104584 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.868 104584 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.869 104584 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.869 104584 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.869 104584 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.869 104584 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.870 104584 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.870 104584 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.870 104584 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.870 104584 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.871 104584 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.871 104584 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.871 104584 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.871 104584 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.872 104584 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.872 104584 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.872 104584 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.872 104584 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.873 104584 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.873 104584 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.873 104584 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.873 104584 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.874 104584 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.874 104584 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.874 104584 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.874 104584 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.875 104584 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.875 104584 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.875 104584 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.875 104584 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.876 104584 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.876 104584 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.876 104584 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.876 104584 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.877 104584 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.877 104584 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.877 104584 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.877 104584 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.877 104584 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.877 104584 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.878 104584 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.878 104584 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.878 104584 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.878 104584 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.879 104584 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.879 104584 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.879 104584 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.879 104584 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.880 104584 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.880 104584 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.880 104584 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.880 104584 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.881 104584 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.881 104584 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.881 104584 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.881 104584 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.881 104584 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.881 104584 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.882 104584 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.882 104584 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.882 104584 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.882 104584 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.882 104584 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.882 104584 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.883 104584 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.883 104584 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.883 104584 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.883 104584 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.883 104584 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.884 104584 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.884 104584 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.884 104584 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.884 104584 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.884 104584 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.884 104584 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.884 104584 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.885 104584 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.885 104584 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.885 104584 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.885 104584 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.885 104584 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.885 104584 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.885 104584 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.886 104584 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.886 104584 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.886 104584 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.886 104584 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.886 104584 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.886 104584 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.887 104584 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.887 104584 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.887 104584 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.887 104584 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.887 104584 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.887 104584 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.887 104584 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.888 104584 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.888 104584 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.888 104584 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.888 104584 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.888 104584 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.888 104584 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.888 104584 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.889 104584 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.889 104584 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.889 104584 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.889 104584 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.889 104584 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.889 104584 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.889 104584 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.889 104584 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.890 104584 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.890 104584 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.890 104584 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.890 104584 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.890 104584 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.890 104584 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.890 104584 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.891 104584 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.891 104584 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.891 104584 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.891 104584 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.891 104584 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.891 104584 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.891 104584 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.892 104584 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.892 104584 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.892 104584 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.892 104584 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.892 104584 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.892 104584 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.893 104584 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.893 104584 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.893 104584 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.893 104584 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.893 104584 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.894 104584 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.894 104584 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.894 104584 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.894 104584 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.894 104584 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.895 104584 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.895 104584 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.895 104584 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.895 104584 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.895 104584 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.895 104584 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.895 104584 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.896 104584 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.896 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.896 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.896 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.896 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.896 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.897 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.897 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.897 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.897 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.897 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.897 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.898 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.898 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.898 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.898 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.898 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.898 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.899 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.899 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.899 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.899 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.899 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.900 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.900 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.900 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.900 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.900 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.900 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.901 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.901 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.901 104584 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.901 104584 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.901 104584 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.901 104584 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.902 104584 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:40:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:40:09.902 104584 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 28 19:40:13 np0005539279 systemd-logind[811]: New session 23 of user zuul.
Nov 28 19:40:13 np0005539279 systemd[1]: Started Session 23 of User zuul.
Nov 28 19:40:14 np0005539279 python3.9[104856]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:40:15 np0005539279 python3.9[105012]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:40:17 np0005539279 python3.9[105177]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:40:17 np0005539279 systemd[1]: Reloading.
Nov 28 19:40:17 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:40:17 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:40:18 np0005539279 python3.9[105362]: ansible-ansible.builtin.service_facts Invoked
Nov 28 19:40:18 np0005539279 network[105379]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 19:40:18 np0005539279 network[105380]: 'network-scripts' will be removed from distribution in near future.
Nov 28 19:40:18 np0005539279 network[105381]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 19:40:24 np0005539279 python3.9[105647]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:40:25 np0005539279 python3.9[105800]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:40:26 np0005539279 python3.9[105953]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:40:27 np0005539279 python3.9[106108]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:40:28 np0005539279 python3.9[106261]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:40:28 np0005539279 python3.9[106414]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:40:29 np0005539279 python3.9[106567]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:40:31 np0005539279 python3.9[106722]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:31 np0005539279 python3.9[106874]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:32 np0005539279 python3.9[107026]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:33 np0005539279 python3.9[107178]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:33 np0005539279 python3.9[107330]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:34 np0005539279 python3.9[107482]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:35 np0005539279 podman[107606]: 2025-11-29 00:40:35.339411365 +0000 UTC m=+0.159493992 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 19:40:35 np0005539279 python3.9[107647]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:36 np0005539279 python3.9[107812]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:36 np0005539279 podman[107936]: 2025-11-29 00:40:36.839765771 +0000 UTC m=+0.080310176 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 28 19:40:37 np0005539279 python3.9[107981]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:37 np0005539279 python3.9[108135]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:38 np0005539279 python3.9[108287]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:39 np0005539279 python3.9[108439]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:39 np0005539279 python3.9[108591]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:40 np0005539279 python3.9[108743]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:40:41 np0005539279 python3.9[108895]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:40:42 np0005539279 python3.9[109047]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 19:40:43 np0005539279 python3.9[109199]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:40:43 np0005539279 systemd[1]: Reloading.
Nov 28 19:40:43 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:40:43 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:40:44 np0005539279 python3.9[109387]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:40:45 np0005539279 python3.9[109540]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:40:46 np0005539279 python3.9[109693]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:40:46 np0005539279 python3.9[109846]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:40:47 np0005539279 python3.9[109999]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:40:48 np0005539279 python3.9[110152]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:40:49 np0005539279 python3.9[110305]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:40:50 np0005539279 python3.9[110458]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 28 19:40:51 np0005539279 python3.9[110615]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 19:40:52 np0005539279 python3.9[110775]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 19:40:53 np0005539279 python3.9[110935]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:40:54 np0005539279 python3.9[111019]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:41:05 np0005539279 podman[111106]: 2025-11-29 00:41:05.856295255 +0000 UTC m=+0.092759774 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 19:41:07 np0005539279 podman[111185]: 2025-11-29 00:41:07.835506813 +0000 UTC m=+0.075352990 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Nov 28 19:41:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:41:08.068 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:41:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:41:08.069 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:41:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:41:08.069 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:41:24 np0005539279 kernel: SELinux:  Converting 2757 SID table entries...
Nov 28 19:41:24 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:41:24 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:41:24 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:41:24 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:41:24 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:41:24 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:41:24 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:41:34 np0005539279 kernel: SELinux:  Converting 2757 SID table entries...
Nov 28 19:41:34 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:41:34 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:41:34 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:41:34 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:41:34 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:41:34 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:41:34 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:41:36 np0005539279 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 28 19:41:36 np0005539279 podman[111275]: 2025-11-29 00:41:36.840437041 +0000 UTC m=+0.130243948 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:41:38 np0005539279 podman[111303]: 2025-11-29 00:41:38.828807121 +0000 UTC m=+0.068652762 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 19:42:07 np0005539279 podman[121292]: 2025-11-29 00:42:07.96180097 +0000 UTC m=+0.203454632 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:42:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:42:08.069 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:42:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:42:08.071 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:42:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:42:08.071 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:42:09 np0005539279 podman[122120]: 2025-11-29 00:42:09.840697357 +0000 UTC m=+0.083229651 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 19:42:35 np0005539279 kernel: SELinux:  Converting 2759 SID table entries...
Nov 28 19:42:35 np0005539279 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 19:42:35 np0005539279 kernel: SELinux:  policy capability open_perms=1
Nov 28 19:42:35 np0005539279 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 19:42:35 np0005539279 kernel: SELinux:  policy capability always_check_network=0
Nov 28 19:42:35 np0005539279 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 19:42:35 np0005539279 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 19:42:35 np0005539279 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 19:42:37 np0005539279 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Nov 28 19:42:37 np0005539279 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 28 19:42:37 np0005539279 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Nov 28 19:42:38 np0005539279 podman[128225]: 2025-11-29 00:42:38.216809333 +0000 UTC m=+0.152758906 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 19:42:40 np0005539279 podman[128284]: 2025-11-29 00:42:40.737525668 +0000 UTC m=+0.113821562 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 28 19:42:45 np0005539279 systemd[1]: Stopping OpenSSH server daemon...
Nov 28 19:42:45 np0005539279 systemd[1]: sshd.service: Deactivated successfully.
Nov 28 19:42:45 np0005539279 systemd[1]: Stopped OpenSSH server daemon.
Nov 28 19:42:45 np0005539279 systemd[1]: sshd.service: Consumed 7.290s CPU time, read 32.0K from disk, written 248.0K to disk.
Nov 28 19:42:45 np0005539279 systemd[1]: Stopped target sshd-keygen.target.
Nov 28 19:42:45 np0005539279 systemd[1]: Stopping sshd-keygen.target...
Nov 28 19:42:45 np0005539279 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 19:42:45 np0005539279 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 19:42:45 np0005539279 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 19:42:45 np0005539279 systemd[1]: Reached target sshd-keygen.target.
Nov 28 19:42:45 np0005539279 systemd[1]: Starting OpenSSH server daemon...
Nov 28 19:42:45 np0005539279 systemd[1]: Started OpenSSH server daemon.
Nov 28 19:42:48 np0005539279 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 19:42:48 np0005539279 systemd[1]: Starting man-db-cache-update.service...
Nov 28 19:42:48 np0005539279 systemd[1]: Reloading.
Nov 28 19:42:48 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:42:48 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:42:48 np0005539279 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 19:42:53 np0005539279 python3.9[133613]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 19:42:53 np0005539279 systemd[1]: Reloading.
Nov 28 19:42:53 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:42:53 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:42:54 np0005539279 python3.9[134857]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 19:42:54 np0005539279 systemd[1]: Reloading.
Nov 28 19:42:54 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:42:54 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:42:55 np0005539279 python3.9[136015]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 19:42:55 np0005539279 systemd[1]: Reloading.
Nov 28 19:42:56 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:42:56 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:42:57 np0005539279 python3.9[137288]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 19:42:57 np0005539279 systemd[1]: Reloading.
Nov 28 19:42:57 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:42:57 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:42:58 np0005539279 python3.9[138387]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:42:58 np0005539279 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 19:42:58 np0005539279 systemd[1]: Finished man-db-cache-update.service.
Nov 28 19:42:58 np0005539279 systemd[1]: man-db-cache-update.service: Consumed 13.252s CPU time.
Nov 28 19:42:58 np0005539279 systemd[1]: run-raad8ca8314a64d57b5efd1a6b91bccf3.service: Deactivated successfully.
Nov 28 19:42:58 np0005539279 systemd[1]: Reloading.
Nov 28 19:42:58 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:42:58 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:42:59 np0005539279 python3.9[138751]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:42:59 np0005539279 systemd[1]: Reloading.
Nov 28 19:42:59 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:42:59 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:43:01 np0005539279 python3.9[138940]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:01 np0005539279 systemd[1]: Reloading.
Nov 28 19:43:01 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:43:01 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:43:02 np0005539279 python3.9[139129]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:03 np0005539279 python3.9[139284]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:03 np0005539279 systemd[1]: Reloading.
Nov 28 19:43:03 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:43:03 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:43:04 np0005539279 python3.9[139474]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 19:43:04 np0005539279 systemd[1]: Reloading.
Nov 28 19:43:05 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:43:05 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:43:05 np0005539279 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 28 19:43:05 np0005539279 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 28 19:43:06 np0005539279 python3.9[139667]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:07 np0005539279 python3.9[139822]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:43:08.071 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:43:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:43:08.073 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:43:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:43:08.073 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:43:08 np0005539279 python3.9[139977]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:08 np0005539279 podman[139979]: 2025-11-29 00:43:08.519077916 +0000 UTC m=+0.146805940 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 19:43:09 np0005539279 python3.9[140159]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:10 np0005539279 python3.9[140314]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:10 np0005539279 podman[140443]: 2025-11-29 00:43:10.981196323 +0000 UTC m=+0.083463750 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 19:43:11 np0005539279 python3.9[140488]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:12 np0005539279 python3.9[140645]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:14 np0005539279 python3.9[140802]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:15 np0005539279 python3.9[140957]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:16 np0005539279 python3.9[141112]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:17 np0005539279 python3.9[141267]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:18 np0005539279 python3.9[141422]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:19 np0005539279 python3.9[141577]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:21 np0005539279 python3.9[141734]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 19:43:22 np0005539279 python3.9[141889]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:43:23 np0005539279 python3.9[142043]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:43:24 np0005539279 python3.9[142195]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:43:25 np0005539279 python3.9[142347]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:43:25 np0005539279 python3.9[142499]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:43:26 np0005539279 python3.9[142651]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:43:27 np0005539279 python3.9[142803]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:28 np0005539279 python3.9[142930]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764377006.902401-554-214489499166721/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:29 np0005539279 python3.9[143082]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:30 np0005539279 python3.9[143207]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764377008.6761427-554-40418899143484/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:30 np0005539279 python3.9[143359]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:31 np0005539279 python3.9[143484]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764377010.3322246-554-70343608010395/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:32 np0005539279 python3.9[143636]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:33 np0005539279 python3.9[143761]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764377011.8436568-554-122240326338116/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:33 np0005539279 python3.9[143913]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:34 np0005539279 python3.9[144038]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764377013.3835258-554-234758141646672/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:35 np0005539279 python3.9[144190]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:36 np0005539279 python3.9[144315]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764377014.9154556-554-115776041053561/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:37 np0005539279 python3.9[144467]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:37 np0005539279 python3.9[144590]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764377016.4117677-554-146141056061389/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:38 np0005539279 python3.9[144742]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:38 np0005539279 podman[144792]: 2025-11-29 00:43:38.885583209 +0000 UTC m=+0.120916099 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 28 19:43:39 np0005539279 python3.9[144893]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764377017.8734496-554-84453495238393/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:40 np0005539279 python3.9[145045]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 28 19:43:41 np0005539279 python3.9[145198]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:41 np0005539279 podman[145322]: 2025-11-29 00:43:41.687634744 +0000 UTC m=+0.105445501 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:43:41 np0005539279 python3.9[145367]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:42 np0005539279 python3.9[145519]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:43 np0005539279 python3.9[145671]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:44 np0005539279 python3.9[145823]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:45 np0005539279 python3.9[145975]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:45 np0005539279 python3.9[146127]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:46 np0005539279 python3.9[146279]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:47 np0005539279 python3.9[146431]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:48 np0005539279 python3.9[146583]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:49 np0005539279 python3.9[146735]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:49 np0005539279 python3.9[146887]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:50 np0005539279 python3.9[147039]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:51 np0005539279 python3.9[147191]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:52 np0005539279 python3.9[147347]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:53 np0005539279 python3.9[147470]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377031.838579-775-121484776317402/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:53 np0005539279 python3.9[147622]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:54 np0005539279 python3.9[147745]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377033.3987825-775-130417061293120/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:55 np0005539279 python3.9[147897]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:56 np0005539279 python3.9[148020]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377034.9273593-775-207549271130388/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:56 np0005539279 python3.9[148172]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:57 np0005539279 python3.9[148295]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377036.377034-775-269693406027228/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:58 np0005539279 python3.9[148449]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:43:58 np0005539279 python3.9[148572]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377037.7520568-775-181964571394104/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:43:59 np0005539279 python3.9[148724]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:00 np0005539279 python3.9[148847]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377039.1432338-775-203817406676848/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:01 np0005539279 python3.9[148999]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:01 np0005539279 python3.9[149122]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377040.5457823-775-19015346965447/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:02 np0005539279 python3.9[149274]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:03 np0005539279 python3.9[149397]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377041.7969592-775-270914792044084/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:03 np0005539279 python3.9[149549]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:04 np0005539279 python3.9[149672]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377043.2574263-775-64961896102398/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:05 np0005539279 python3.9[149824]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:06 np0005539279 python3.9[149947]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377044.870123-775-78850345076217/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:06 np0005539279 python3.9[150099]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:07 np0005539279 python3.9[150222]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377046.336925-775-83133986545068/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:44:08.072 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:44:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:44:08.073 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:44:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:44:08.073 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:44:08 np0005539279 python3.9[150374]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:09 np0005539279 python3.9[150497]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377047.7349262-775-232585287400996/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:09 np0005539279 podman[150621]: 2025-11-29 00:44:09.79746842 +0000 UTC m=+0.129015559 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 19:44:09 np0005539279 python3.9[150670]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:10 np0005539279 python3.9[150799]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377049.3287442-775-231166524664710/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:11 np0005539279 python3.9[150951]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:11 np0005539279 podman[151027]: 2025-11-29 00:44:11.852009794 +0000 UTC m=+0.089649532 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 19:44:12 np0005539279 python3.9[151093]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377050.8311062-775-17019181127840/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:12 np0005539279 python3.9[151243]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:44:13 np0005539279 python3.9[151398]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 28 19:44:15 np0005539279 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 28 19:44:16 np0005539279 python3.9[151554]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:16 np0005539279 python3.9[151706]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:17 np0005539279 python3.9[151858]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:18 np0005539279 python3.9[152010]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:19 np0005539279 python3.9[152164]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:20 np0005539279 python3.9[152316]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:20 np0005539279 python3.9[152468]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:21 np0005539279 python3.9[152620]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:22 np0005539279 python3.9[152772]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:23 np0005539279 python3.9[152926]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:24 np0005539279 python3.9[153078]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:44:24 np0005539279 systemd[1]: Reloading.
Nov 28 19:44:24 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:44:24 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:44:24 np0005539279 systemd[1]: Starting libvirt logging daemon socket...
Nov 28 19:44:24 np0005539279 systemd[1]: Listening on libvirt logging daemon socket.
Nov 28 19:44:24 np0005539279 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 28 19:44:24 np0005539279 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 28 19:44:24 np0005539279 systemd[1]: Starting libvirt logging daemon...
Nov 28 19:44:24 np0005539279 systemd[1]: Started libvirt logging daemon.
Nov 28 19:44:25 np0005539279 python3.9[153272]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:44:25 np0005539279 systemd[1]: Reloading.
Nov 28 19:44:25 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:44:25 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:44:26 np0005539279 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 28 19:44:26 np0005539279 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 28 19:44:26 np0005539279 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 28 19:44:26 np0005539279 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 28 19:44:26 np0005539279 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 28 19:44:26 np0005539279 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 28 19:44:26 np0005539279 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 28 19:44:26 np0005539279 systemd[1]: Starting libvirt nodedev daemon...
Nov 28 19:44:26 np0005539279 systemd[1]: Started libvirt nodedev daemon.
Nov 28 19:44:26 np0005539279 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 28 19:44:26 np0005539279 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 28 19:44:26 np0005539279 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 28 19:44:27 np0005539279 python3.9[153498]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:44:27 np0005539279 systemd[1]: Reloading.
Nov 28 19:44:27 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:44:27 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:44:27 np0005539279 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 28 19:44:27 np0005539279 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 28 19:44:27 np0005539279 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 28 19:44:27 np0005539279 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 28 19:44:27 np0005539279 systemd[1]: Starting libvirt proxy daemon...
Nov 28 19:44:27 np0005539279 systemd[1]: Started libvirt proxy daemon.
Nov 28 19:44:27 np0005539279 setroubleshoot[153311]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 94b7a843-a625-42d9-80e2-5652a5c237cb
Nov 28 19:44:27 np0005539279 setroubleshoot[153311]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 28 19:44:27 np0005539279 setroubleshoot[153311]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 94b7a843-a625-42d9-80e2-5652a5c237cb
Nov 28 19:44:27 np0005539279 setroubleshoot[153311]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 28 19:44:28 np0005539279 python3.9[153712]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:44:28 np0005539279 systemd[1]: Reloading.
Nov 28 19:44:28 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:44:28 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:44:28 np0005539279 systemd[1]: Listening on libvirt locking daemon socket.
Nov 28 19:44:28 np0005539279 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 28 19:44:28 np0005539279 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 28 19:44:28 np0005539279 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 28 19:44:28 np0005539279 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 28 19:44:28 np0005539279 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 28 19:44:28 np0005539279 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 28 19:44:28 np0005539279 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 28 19:44:28 np0005539279 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 28 19:44:28 np0005539279 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 28 19:44:28 np0005539279 systemd[1]: Starting libvirt QEMU daemon...
Nov 28 19:44:28 np0005539279 systemd[1]: Started libvirt QEMU daemon.
Nov 28 19:44:29 np0005539279 python3.9[153927]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:44:29 np0005539279 systemd[1]: Reloading.
Nov 28 19:44:29 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:44:29 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:44:29 np0005539279 systemd[1]: Starting libvirt secret daemon socket...
Nov 28 19:44:29 np0005539279 systemd[1]: Listening on libvirt secret daemon socket.
Nov 28 19:44:29 np0005539279 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 28 19:44:29 np0005539279 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 28 19:44:29 np0005539279 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 28 19:44:29 np0005539279 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 28 19:44:29 np0005539279 systemd[1]: Starting libvirt secret daemon...
Nov 28 19:44:29 np0005539279 systemd[1]: Started libvirt secret daemon.
Nov 28 19:44:30 np0005539279 python3.9[154138]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:31 np0005539279 python3.9[154290]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 19:44:32 np0005539279 python3.9[154444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:33 np0005539279 python3.9[154569]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377072.176494-1120-25891058811101/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:34 np0005539279 python3.9[154721]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:35 np0005539279 python3.9[154873]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:35 np0005539279 python3.9[154951]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:36 np0005539279 python3.9[155103]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:37 np0005539279 python3.9[155181]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zaknk0a5 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:37 np0005539279 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 28 19:44:37 np0005539279 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 28 19:44:37 np0005539279 python3.9[155334]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:38 np0005539279 python3.9[155412]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:39 np0005539279 python3.9[155564]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:44:40 np0005539279 podman[155689]: 2025-11-29 00:44:40.015701386 +0000 UTC m=+0.116619206 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 19:44:40 np0005539279 python3[155734]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 19:44:40 np0005539279 python3.9[155893]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:41 np0005539279 python3.9[155973]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:42 np0005539279 podman[156097]: 2025-11-29 00:44:42.071537803 +0000 UTC m=+0.071589860 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 19:44:42 np0005539279 python3.9[156142]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:42 np0005539279 python3.9[156220]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:43 np0005539279 python3.9[156372]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:44 np0005539279 python3.9[156450]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:44 np0005539279 python3.9[156602]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:45 np0005539279 python3.9[156682]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:46 np0005539279 python3.9[156834]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:46 np0005539279 python3.9[156959]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377085.5483546-1245-241303618607690/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:47 np0005539279 python3.9[157111]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:48 np0005539279 python3.9[157263]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:44:49 np0005539279 python3.9[157418]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:49 np0005539279 python3.9[157570]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:44:50 np0005539279 python3.9[157723]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:44:51 np0005539279 python3.9[157877]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:44:52 np0005539279 python3.9[158033]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:53 np0005539279 python3.9[158185]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:53 np0005539279 python3.9[158308]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377092.6510696-1317-235077686291197/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:54 np0005539279 python3.9[158460]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:55 np0005539279 python3.9[158583]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377094.0308027-1332-200605611290180/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:55 np0005539279 python3.9[158735]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:44:56 np0005539279 python3.9[158858]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377095.3784359-1347-9544578988054/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:44:57 np0005539279 python3.9[159010]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:44:57 np0005539279 systemd[1]: Reloading.
Nov 28 19:44:57 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:44:57 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:44:57 np0005539279 systemd[1]: Reached target edpm_libvirt.target.
Nov 28 19:44:58 np0005539279 python3.9[159202]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 19:44:58 np0005539279 systemd[1]: Reloading.
Nov 28 19:44:58 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:44:59 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:44:59 np0005539279 systemd[1]: Reloading.
Nov 28 19:44:59 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:44:59 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:44:59 np0005539279 systemd-logind[811]: Session 23 logged out. Waiting for processes to exit.
Nov 28 19:44:59 np0005539279 systemd[1]: session-23.scope: Deactivated successfully.
Nov 28 19:44:59 np0005539279 systemd[1]: session-23.scope: Consumed 3min 54.899s CPU time.
Nov 28 19:44:59 np0005539279 systemd-logind[811]: Removed session 23.
Nov 28 19:45:05 np0005539279 systemd-logind[811]: New session 24 of user zuul.
Nov 28 19:45:05 np0005539279 systemd[1]: Started Session 24 of User zuul.
Nov 28 19:45:06 np0005539279 python3.9[159453]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:45:07 np0005539279 python3.9[159607]: ansible-ansible.builtin.service_facts Invoked
Nov 28 19:45:07 np0005539279 network[159624]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 19:45:07 np0005539279 network[159625]: 'network-scripts' will be removed from distribution in near future.
Nov 28 19:45:07 np0005539279 network[159626]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 19:45:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:45:08.075 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:45:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:45:08.076 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:45:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:45:08.076 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:45:10 np0005539279 podman[159665]: 2025-11-29 00:45:10.168727075 +0000 UTC m=+0.099306197 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 19:45:12 np0005539279 podman[159767]: 2025-11-29 00:45:12.442828783 +0000 UTC m=+0.085562012 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 19:45:14 np0005539279 python3.9[159943]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 19:45:15 np0005539279 python3.9[160027]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:45:21 np0005539279 python3.9[160180]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:45:22 np0005539279 python3.9[160332]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:45:23 np0005539279 python3.9[160485]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:45:23 np0005539279 python3.9[160637]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:45:24 np0005539279 python3.9[160790]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:45:25 np0005539279 python3.9[160913]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377124.1580155-95-41028989838431/.source.iscsi _original_basename=.83zvrj3v follow=False checksum=183e283596e2bef1eeec0fd68c4c6e52b82018d4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:26 np0005539279 python3.9[161065]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:27 np0005539279 python3.9[161217]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:27 np0005539279 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 19:45:28 np0005539279 python3.9[161370]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:45:28 np0005539279 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 28 19:45:29 np0005539279 python3.9[161526]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:45:29 np0005539279 systemd[1]: Reloading.
Nov 28 19:45:29 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:45:29 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:45:30 np0005539279 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 28 19:45:30 np0005539279 systemd[1]: Starting Open-iSCSI...
Nov 28 19:45:30 np0005539279 kernel: Loading iSCSI transport class v2.0-870.
Nov 28 19:45:30 np0005539279 systemd[1]: Started Open-iSCSI.
Nov 28 19:45:30 np0005539279 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 28 19:45:30 np0005539279 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 28 19:45:31 np0005539279 python3.9[161727]: ansible-ansible.builtin.service_facts Invoked
Nov 28 19:45:31 np0005539279 network[161744]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 19:45:31 np0005539279 network[161745]: 'network-scripts' will be removed from distribution in near future.
Nov 28 19:45:31 np0005539279 network[161746]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 19:45:36 np0005539279 python3.9[162017]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 19:45:37 np0005539279 python3.9[162169]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 28 19:45:37 np0005539279 python3.9[162325]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:45:38 np0005539279 python3.9[162448]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377137.3703198-172-50050082425317/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:39 np0005539279 python3.9[162600]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:40 np0005539279 podman[162754]: 2025-11-29 00:45:40.412295603 +0000 UTC m=+0.128910341 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 19:45:40 np0005539279 python3.9[162755]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:45:40 np0005539279 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 19:45:40 np0005539279 systemd[1]: Stopped Load Kernel Modules.
Nov 28 19:45:40 np0005539279 systemd[1]: Stopping Load Kernel Modules...
Nov 28 19:45:40 np0005539279 systemd[1]: Starting Load Kernel Modules...
Nov 28 19:45:40 np0005539279 systemd[1]: Finished Load Kernel Modules.
Nov 28 19:45:41 np0005539279 python3.9[162938]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:45:42 np0005539279 python3.9[163092]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:45:42 np0005539279 podman[163169]: 2025-11-29 00:45:42.834780582 +0000 UTC m=+0.078711051 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:45:43 np0005539279 python3.9[163262]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:45:44 np0005539279 python3.9[163414]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:45:44 np0005539279 python3.9[163539]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377143.4366884-230-107098366353152/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:45 np0005539279 python3.9[163691]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:45:46 np0005539279 python3.9[163844]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:47 np0005539279 python3.9[163996]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:48 np0005539279 python3.9[164148]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:49 np0005539279 python3.9[164300]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:49 np0005539279 python3.9[164452]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:50 np0005539279 python3.9[164604]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:51 np0005539279 python3.9[164756]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:52 np0005539279 python3.9[164908]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:45:52 np0005539279 python3.9[165062]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:53 np0005539279 python3.9[165214]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:45:54 np0005539279 python3.9[165366]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:45:55 np0005539279 python3.9[165444]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:45:56 np0005539279 python3.9[165596]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:45:56 np0005539279 python3.9[165674]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:45:57 np0005539279 python3.9[165826]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:58 np0005539279 python3.9[165978]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:45:58 np0005539279 python3.9[166056]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:45:59 np0005539279 python3.9[166208]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:46:00 np0005539279 python3.9[166286]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:00 np0005539279 python3.9[166440]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:01 np0005539279 systemd[1]: Reloading.
Nov 28 19:46:01 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:46:01 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:46:02 np0005539279 python3.9[166629]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:46:02 np0005539279 python3.9[166707]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:03 np0005539279 python3.9[166859]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:46:04 np0005539279 python3.9[166937]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:05 np0005539279 python3.9[167091]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:05 np0005539279 systemd[1]: Reloading.
Nov 28 19:46:05 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:46:05 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:46:05 np0005539279 systemd[1]: Starting Create netns directory...
Nov 28 19:46:05 np0005539279 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 19:46:05 np0005539279 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 19:46:05 np0005539279 systemd[1]: Finished Create netns directory.
Nov 28 19:46:06 np0005539279 python3.9[167289]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:46:07 np0005539279 python3.9[167441]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:46:07 np0005539279 python3.9[167566]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377166.7069428-437-204879531972282/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:46:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:46:08.075 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:46:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:46:08.076 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:46:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:46:08.077 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:46:09 np0005539279 python3.9[167718]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:46:09 np0005539279 python3.9[167870]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:46:10 np0005539279 python3.9[167993]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377169.3475838-462-61425732708389/.source.json _original_basename=.6x6h_ru_ follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:10 np0005539279 podman[168017]: 2025-11-29 00:46:10.869762974 +0000 UTC m=+0.116622831 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 19:46:12 np0005539279 python3.9[168171]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:13 np0005539279 podman[168297]: 2025-11-29 00:46:13.394872185 +0000 UTC m=+0.082803884 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:46:15 np0005539279 python3.9[168622]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 28 19:46:16 np0005539279 python3.9[168774]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 19:46:17 np0005539279 python3.9[168926]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 19:46:19 np0005539279 python3[169106]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 19:46:19 np0005539279 podman[169142]: 2025-11-29 00:46:19.944197889 +0000 UTC m=+0.082882935 container create b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 19:46:19 np0005539279 podman[169142]: 2025-11-29 00:46:19.906042912 +0000 UTC m=+0.044728028 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 19:46:19 np0005539279 python3[169106]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 19:46:20 np0005539279 python3.9[169332]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:46:22 np0005539279 python3.9[169486]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:22 np0005539279 python3.9[169562]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:46:23 np0005539279 python3.9[169713]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764377182.6214297-550-138275402282177/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:24 np0005539279 python3.9[169789]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:46:24 np0005539279 systemd[1]: Reloading.
Nov 28 19:46:24 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:46:24 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:46:25 np0005539279 python3.9[169899]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:25 np0005539279 systemd[1]: Reloading.
Nov 28 19:46:25 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:46:25 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:46:25 np0005539279 systemd[1]: Starting multipathd container...
Nov 28 19:46:25 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:46:25 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/914929d6c9925ca315075a0f608e9e236503fba79802f0786c52d23a91abf032/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 19:46:25 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/914929d6c9925ca315075a0f608e9e236503fba79802f0786c52d23a91abf032/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 19:46:25 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0.
Nov 28 19:46:25 np0005539279 podman[169939]: 2025-11-29 00:46:25.821455107 +0000 UTC m=+0.176538390 container init b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 19:46:25 np0005539279 multipathd[169954]: + sudo -E kolla_set_configs
Nov 28 19:46:25 np0005539279 podman[169939]: 2025-11-29 00:46:25.860155471 +0000 UTC m=+0.215238674 container start b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 19:46:25 np0005539279 podman[169939]: multipathd
Nov 28 19:46:25 np0005539279 systemd[1]: Started multipathd container.
Nov 28 19:46:25 np0005539279 multipathd[169954]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 19:46:25 np0005539279 multipathd[169954]: INFO:__main__:Validating config file
Nov 28 19:46:25 np0005539279 multipathd[169954]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 19:46:25 np0005539279 multipathd[169954]: INFO:__main__:Writing out command to execute
Nov 28 19:46:25 np0005539279 podman[169961]: 2025-11-29 00:46:25.946614949 +0000 UTC m=+0.073732403 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 19:46:25 np0005539279 multipathd[169954]: ++ cat /run_command
Nov 28 19:46:25 np0005539279 multipathd[169954]: + CMD='/usr/sbin/multipathd -d'
Nov 28 19:46:25 np0005539279 multipathd[169954]: + ARGS=
Nov 28 19:46:25 np0005539279 multipathd[169954]: + sudo kolla_copy_cacerts
Nov 28 19:46:25 np0005539279 systemd[1]: b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0-4e9bea3a86658029.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 19:46:25 np0005539279 systemd[1]: b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0-4e9bea3a86658029.service: Failed with result 'exit-code'.
Nov 28 19:46:25 np0005539279 multipathd[169954]: + [[ ! -n '' ]]
Nov 28 19:46:25 np0005539279 multipathd[169954]: + . kolla_extend_start
Nov 28 19:46:25 np0005539279 multipathd[169954]: Running command: '/usr/sbin/multipathd -d'
Nov 28 19:46:25 np0005539279 multipathd[169954]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 19:46:25 np0005539279 multipathd[169954]: + umask 0022
Nov 28 19:46:25 np0005539279 multipathd[169954]: + exec /usr/sbin/multipathd -d
Nov 28 19:46:25 np0005539279 multipathd[169954]: 3119.750360 | --------start up--------
Nov 28 19:46:25 np0005539279 multipathd[169954]: 3119.750377 | read /etc/multipath.conf
Nov 28 19:46:25 np0005539279 multipathd[169954]: 3119.757653 | path checkers start up
Nov 28 19:46:26 np0005539279 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 28 19:46:26 np0005539279 python3.9[170143]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:46:27 np0005539279 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 19:46:27 np0005539279 python3.9[170297]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:46:28 np0005539279 python3.9[170463]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:46:28 np0005539279 systemd[1]: Stopping multipathd container...
Nov 28 19:46:28 np0005539279 multipathd[169954]: 3122.464587 | exit (signal)
Nov 28 19:46:28 np0005539279 multipathd[169954]: 3122.466412 | --------shut down-------
Nov 28 19:46:28 np0005539279 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 28 19:46:28 np0005539279 systemd[1]: libpod-b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0.scope: Deactivated successfully.
Nov 28 19:46:28 np0005539279 podman[170467]: 2025-11-29 00:46:28.747566291 +0000 UTC m=+0.102148560 container died b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 19:46:28 np0005539279 systemd[1]: b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0-4e9bea3a86658029.timer: Deactivated successfully.
Nov 28 19:46:28 np0005539279 systemd[1]: Stopped /usr/bin/podman healthcheck run b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0.
Nov 28 19:46:28 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0-userdata-shm.mount: Deactivated successfully.
Nov 28 19:46:28 np0005539279 systemd[1]: var-lib-containers-storage-overlay-914929d6c9925ca315075a0f608e9e236503fba79802f0786c52d23a91abf032-merged.mount: Deactivated successfully.
Nov 28 19:46:28 np0005539279 podman[170467]: 2025-11-29 00:46:28.848030251 +0000 UTC m=+0.202612540 container cleanup b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 19:46:28 np0005539279 podman[170467]: multipathd
Nov 28 19:46:28 np0005539279 podman[170497]: multipathd
Nov 28 19:46:28 np0005539279 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 28 19:46:28 np0005539279 systemd[1]: Stopped multipathd container.
Nov 28 19:46:28 np0005539279 systemd[1]: Starting multipathd container...
Nov 28 19:46:29 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:46:29 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/914929d6c9925ca315075a0f608e9e236503fba79802f0786c52d23a91abf032/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 19:46:29 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/914929d6c9925ca315075a0f608e9e236503fba79802f0786c52d23a91abf032/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 19:46:29 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0.
Nov 28 19:46:29 np0005539279 podman[170510]: 2025-11-29 00:46:29.125392102 +0000 UTC m=+0.146162677 container init b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:46:29 np0005539279 multipathd[170525]: + sudo -E kolla_set_configs
Nov 28 19:46:29 np0005539279 podman[170510]: 2025-11-29 00:46:29.156528528 +0000 UTC m=+0.177299013 container start b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 19:46:29 np0005539279 podman[170510]: multipathd
Nov 28 19:46:29 np0005539279 systemd[1]: Started multipathd container.
Nov 28 19:46:29 np0005539279 multipathd[170525]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 19:46:29 np0005539279 multipathd[170525]: INFO:__main__:Validating config file
Nov 28 19:46:29 np0005539279 multipathd[170525]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 19:46:29 np0005539279 multipathd[170525]: INFO:__main__:Writing out command to execute
Nov 28 19:46:29 np0005539279 multipathd[170525]: ++ cat /run_command
Nov 28 19:46:29 np0005539279 multipathd[170525]: + CMD='/usr/sbin/multipathd -d'
Nov 28 19:46:29 np0005539279 multipathd[170525]: + ARGS=
Nov 28 19:46:29 np0005539279 multipathd[170525]: + sudo kolla_copy_cacerts
Nov 28 19:46:29 np0005539279 podman[170532]: 2025-11-29 00:46:29.218163151 +0000 UTC m=+0.051055290 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 19:46:29 np0005539279 systemd[1]: b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0-30f7ce4b33826daa.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 19:46:29 np0005539279 systemd[1]: b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0-30f7ce4b33826daa.service: Failed with result 'exit-code'.
Nov 28 19:46:29 np0005539279 multipathd[170525]: Running command: '/usr/sbin/multipathd -d'
Nov 28 19:46:29 np0005539279 multipathd[170525]: + [[ ! -n '' ]]
Nov 28 19:46:29 np0005539279 multipathd[170525]: + . kolla_extend_start
Nov 28 19:46:29 np0005539279 multipathd[170525]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 19:46:29 np0005539279 multipathd[170525]: + umask 0022
Nov 28 19:46:29 np0005539279 multipathd[170525]: + exec /usr/sbin/multipathd -d
Nov 28 19:46:29 np0005539279 multipathd[170525]: 3123.026859 | --------start up--------
Nov 28 19:46:29 np0005539279 multipathd[170525]: 3123.026885 | read /etc/multipath.conf
Nov 28 19:46:29 np0005539279 multipathd[170525]: 3123.034872 | path checkers start up
Nov 28 19:46:29 np0005539279 python3.9[170716]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:30 np0005539279 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 28 19:46:30 np0005539279 python3.9[170869]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 19:46:31 np0005539279 python3.9[171021]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 28 19:46:31 np0005539279 kernel: Key type psk registered
Nov 28 19:46:32 np0005539279 python3.9[171184]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:46:33 np0005539279 python3.9[171307]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377192.1396725-630-76110975458120/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:34 np0005539279 python3.9[171459]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:35 np0005539279 python3.9[171611]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:46:35 np0005539279 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 19:46:35 np0005539279 systemd[1]: Stopped Load Kernel Modules.
Nov 28 19:46:35 np0005539279 systemd[1]: Stopping Load Kernel Modules...
Nov 28 19:46:35 np0005539279 systemd[1]: Starting Load Kernel Modules...
Nov 28 19:46:35 np0005539279 systemd[1]: Finished Load Kernel Modules.
Nov 28 19:46:36 np0005539279 python3.9[171767]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 19:46:38 np0005539279 systemd[1]: Reloading.
Nov 28 19:46:38 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:46:38 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:46:39 np0005539279 systemd[1]: Reloading.
Nov 28 19:46:39 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:46:39 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:46:39 np0005539279 systemd-logind[811]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 19:46:39 np0005539279 systemd-logind[811]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 19:46:39 np0005539279 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 19:46:39 np0005539279 systemd[1]: Starting man-db-cache-update.service...
Nov 28 19:46:39 np0005539279 systemd[1]: Reloading.
Nov 28 19:46:39 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:46:39 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:46:40 np0005539279 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 19:46:41 np0005539279 podman[173094]: 2025-11-29 00:46:41.099246848 +0000 UTC m=+0.118012036 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:46:41 np0005539279 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 19:46:41 np0005539279 systemd[1]: Finished man-db-cache-update.service.
Nov 28 19:46:41 np0005539279 systemd[1]: man-db-cache-update.service: Consumed 1.710s CPU time.
Nov 28 19:46:41 np0005539279 systemd[1]: run-rf12703508c7e437092fbdb64caf474fb.service: Deactivated successfully.
Nov 28 19:46:41 np0005539279 python3.9[173197]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:46:41 np0005539279 systemd[1]: Stopping Open-iSCSI...
Nov 28 19:46:41 np0005539279 iscsid[161566]: iscsid shutting down.
Nov 28 19:46:41 np0005539279 systemd[1]: iscsid.service: Deactivated successfully.
Nov 28 19:46:41 np0005539279 systemd[1]: Stopped Open-iSCSI.
Nov 28 19:46:41 np0005539279 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 28 19:46:41 np0005539279 systemd[1]: Starting Open-iSCSI...
Nov 28 19:46:41 np0005539279 systemd[1]: Started Open-iSCSI.
Nov 28 19:46:42 np0005539279 python3.9[173401]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:46:43 np0005539279 python3.9[173557]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:46:43 np0005539279 podman[173582]: 2025-11-29 00:46:43.855938008 +0000 UTC m=+0.094209872 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent)
Nov 28 19:46:44 np0005539279 python3.9[173731]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:46:44 np0005539279 systemd[1]: Reloading.
Nov 28 19:46:44 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:46:44 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:46:45 np0005539279 python3.9[173917]: ansible-ansible.builtin.service_facts Invoked
Nov 28 19:46:46 np0005539279 network[173934]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 19:46:46 np0005539279 network[173935]: 'network-scripts' will be removed from distribution in near future.
Nov 28 19:46:46 np0005539279 network[173936]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 19:46:51 np0005539279 python3.9[174212]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:52 np0005539279 python3.9[174367]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:53 np0005539279 python3.9[174520]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:54 np0005539279 python3.9[174675]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:55 np0005539279 python3.9[174828]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:56 np0005539279 python3.9[174983]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:57 np0005539279 python3.9[175136]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:58 np0005539279 python3.9[175289]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:46:59 np0005539279 podman[175442]: 2025-11-29 00:46:59.395713947 +0000 UTC m=+0.122253019 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 19:46:59 np0005539279 python3.9[175443]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:00 np0005539279 python3.9[175615]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:00 np0005539279 python3.9[175767]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:01 np0005539279 python3.9[175919]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:02 np0005539279 python3.9[176071]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:03 np0005539279 python3.9[176223]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:04 np0005539279 python3.9[176377]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:04 np0005539279 python3.9[176529]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:05 np0005539279 python3.9[176681]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:06 np0005539279 python3.9[176833]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:07 np0005539279 python3.9[176985]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:08 np0005539279 python3.9[177137]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:47:08.076 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:47:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:47:08.077 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:47:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:47:08.077 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:47:08 np0005539279 python3.9[177289]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:09 np0005539279 python3.9[177443]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:10 np0005539279 python3.9[177595]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:11 np0005539279 python3.9[177747]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:11 np0005539279 podman[177868]: 2025-11-29 00:47:11.92580531 +0000 UTC m=+0.170656741 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 19:47:12 np0005539279 python3.9[177921]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:47:13 np0005539279 python3.9[178077]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 19:47:14 np0005539279 python3.9[178229]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:47:14 np0005539279 systemd[1]: Reloading.
Nov 28 19:47:14 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:47:14 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:47:14 np0005539279 podman[178231]: 2025-11-29 00:47:14.334700982 +0000 UTC m=+0.117020009 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Nov 28 19:47:15 np0005539279 python3.9[178435]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:47:16 np0005539279 python3.9[178588]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:47:16 np0005539279 python3.9[178741]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:47:17 np0005539279 python3.9[178894]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:47:18 np0005539279 python3.9[179049]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:47:19 np0005539279 python3.9[179202]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:47:20 np0005539279 python3.9[179355]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:47:20 np0005539279 python3.9[179508]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:47:22 np0005539279 python3.9[179661]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:23 np0005539279 python3.9[179813]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:24 np0005539279 python3.9[179965]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:25 np0005539279 python3.9[180117]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:26 np0005539279 python3.9[180269]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:26 np0005539279 python3.9[180421]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:27 np0005539279 python3.9[180573]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:28 np0005539279 python3.9[180729]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:29 np0005539279 python3.9[180881]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:29 np0005539279 podman[180952]: 2025-11-29 00:47:29.637880147 +0000 UTC m=+0.096805447 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 19:47:30 np0005539279 python3.9[181053]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:35 np0005539279 python3.9[181209]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 28 19:47:36 np0005539279 python3.9[181362]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 19:47:37 np0005539279 python3.9[181520]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 19:47:38 np0005539279 systemd-logind[811]: New session 25 of user zuul.
Nov 28 19:47:38 np0005539279 systemd[1]: Started Session 25 of User zuul.
Nov 28 19:47:38 np0005539279 systemd[1]: session-25.scope: Deactivated successfully.
Nov 28 19:47:38 np0005539279 systemd-logind[811]: Session 25 logged out. Waiting for processes to exit.
Nov 28 19:47:38 np0005539279 systemd-logind[811]: Removed session 25.
Nov 28 19:47:39 np0005539279 python3.9[181706]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:47:40 np0005539279 python3.9[181827]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377258.7613366-1229-174648206459943/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:40 np0005539279 python3.9[181977]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:47:41 np0005539279 python3.9[182053]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:41 np0005539279 python3.9[182203]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:47:42 np0005539279 podman[182298]: 2025-11-29 00:47:42.647225282 +0000 UTC m=+0.167982993 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller)
Nov 28 19:47:42 np0005539279 python3.9[182333]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377261.3940089-1229-277658465989968/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:43 np0005539279 python3.9[182498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:47:44 np0005539279 python3.9[182619]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377262.9002373-1229-128890115027442/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:44 np0005539279 podman[182743]: 2025-11-29 00:47:44.690266644 +0000 UTC m=+0.082964649 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 19:47:44 np0005539279 python3.9[182780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:47:45 np0005539279 python3.9[182907]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377264.277146-1229-119111922466150/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:46 np0005539279 python3.9[183057]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:47:46 np0005539279 python3.9[183178]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377265.719671-1229-153336742381589/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:47 np0005539279 python3.9[183330]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:48 np0005539279 python3.9[183482]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:47:49 np0005539279 python3.9[183634]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:47:50 np0005539279 python3.9[183786]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:47:51 np0005539279 python3.9[183909]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764377269.7890184-1336-18298025869726/.source _original_basename=.uvsp49l4 follow=False checksum=d81efee7c7c53bbf60701102cd1e909f7ec47689 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 28 19:47:51 np0005539279 python3.9[184061]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:47:52 np0005539279 python3.9[184213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:47:53 np0005539279 python3.9[184334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377272.2411063-1362-135334700970342/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:54 np0005539279 python3.9[184484]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:47:54 np0005539279 python3.9[184607]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377273.6955116-1377-247078807237365/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:47:55 np0005539279 python3.9[184759]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 28 19:47:56 np0005539279 python3.9[184911]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 19:47:57 np0005539279 python3[185063]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 19:47:57 np0005539279 podman[185100]: 2025-11-29 00:47:57.938464199 +0000 UTC m=+0.048630070 container create eaa296df7902d34e507460d7b521555a099d888fcae1a2667e1a40364fab9994 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.build-date=20251125, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm)
Nov 28 19:47:57 np0005539279 podman[185100]: 2025-11-29 00:47:57.913446635 +0000 UTC m=+0.023612536 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 19:47:57 np0005539279 python3[185063]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 28 19:47:58 np0005539279 python3.9[185290]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:47:59 np0005539279 podman[185444]: 2025-11-29 00:47:59.845023918 +0000 UTC m=+0.094362991 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:47:59 np0005539279 python3.9[185445]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 28 19:48:00 np0005539279 python3.9[185617]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 19:48:01 np0005539279 python3[185769]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 19:48:02 np0005539279 podman[185807]: 2025-11-29 00:48:02.023671356 +0000 UTC m=+0.061539483 container create f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:48:02 np0005539279 podman[185807]: 2025-11-29 00:48:01.987516888 +0000 UTC m=+0.025385105 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 19:48:02 np0005539279 python3[185769]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 28 19:48:03 np0005539279 python3.9[185997]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:48:03 np0005539279 python3.9[186151]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:04 np0005539279 python3.9[186304]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764377284.026755-1469-151942764952447/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:05 np0005539279 python3.9[186380]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:48:05 np0005539279 systemd[1]: Reloading.
Nov 28 19:48:05 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:48:05 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:48:06 np0005539279 python3.9[186491]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:48:07 np0005539279 systemd[1]: Reloading.
Nov 28 19:48:07 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:48:07 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:48:07 np0005539279 systemd[1]: Starting nova_compute container...
Nov 28 19:48:07 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:48:08 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:08 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:08 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:08 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:08 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:08 np0005539279 podman[186531]: 2025-11-29 00:48:08.029972729 +0000 UTC m=+0.151336408 container init f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 19:48:08 np0005539279 podman[186531]: 2025-11-29 00:48:08.045674938 +0000 UTC m=+0.167038587 container start f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 19:48:08 np0005539279 podman[186531]: nova_compute
Nov 28 19:48:08 np0005539279 nova_compute[186546]: + sudo -E kolla_set_configs
Nov 28 19:48:08 np0005539279 systemd[1]: Started nova_compute container.
Nov 28 19:48:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:48:08.077 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:48:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:48:08.079 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:48:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:48:08.079 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Validating config file
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Copying service configuration files
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Deleting /etc/ceph
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Creating directory /etc/ceph
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Writing out command to execute
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 19:48:08 np0005539279 nova_compute[186546]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 19:48:08 np0005539279 nova_compute[186546]: ++ cat /run_command
Nov 28 19:48:08 np0005539279 nova_compute[186546]: + CMD=nova-compute
Nov 28 19:48:08 np0005539279 nova_compute[186546]: + ARGS=
Nov 28 19:48:08 np0005539279 nova_compute[186546]: + sudo kolla_copy_cacerts
Nov 28 19:48:08 np0005539279 nova_compute[186546]: + [[ ! -n '' ]]
Nov 28 19:48:08 np0005539279 nova_compute[186546]: + . kolla_extend_start
Nov 28 19:48:08 np0005539279 nova_compute[186546]: Running command: 'nova-compute'
Nov 28 19:48:08 np0005539279 nova_compute[186546]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 19:48:08 np0005539279 nova_compute[186546]: + umask 0022
Nov 28 19:48:08 np0005539279 nova_compute[186546]: + exec nova-compute
Nov 28 19:48:09 np0005539279 python3.9[186710]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:48:10 np0005539279 nova_compute[186546]: 2025-11-29 00:48:10.119 186552 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 19:48:10 np0005539279 nova_compute[186546]: 2025-11-29 00:48:10.120 186552 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 19:48:10 np0005539279 nova_compute[186546]: 2025-11-29 00:48:10.120 186552 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 19:48:10 np0005539279 nova_compute[186546]: 2025-11-29 00:48:10.120 186552 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 28 19:48:10 np0005539279 python3.9[186860]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:48:10 np0005539279 nova_compute[186546]: 2025-11-29 00:48:10.246 186552 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:48:10 np0005539279 nova_compute[186546]: 2025-11-29 00:48:10.267 186552 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:48:10 np0005539279 nova_compute[186546]: 2025-11-29 00:48:10.268 186552 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 28 19:48:10 np0005539279 nova_compute[186546]: 2025-11-29 00:48:10.921 186552 INFO nova.virt.driver [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.061 186552 INFO nova.compute.provider_config [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 28 19:48:11 np0005539279 python3.9[187014]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.241 186552 DEBUG oslo_concurrency.lockutils [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.242 186552 DEBUG oslo_concurrency.lockutils [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.243 186552 DEBUG oslo_concurrency.lockutils [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.243 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.244 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.244 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.244 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.245 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.245 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.245 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.245 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.246 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.246 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.246 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.247 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.247 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.248 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.248 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.248 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.249 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.249 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.249 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.250 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.250 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.250 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.251 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.251 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.252 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.252 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.252 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.253 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.254 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.254 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.254 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.255 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.255 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.255 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.256 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.256 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.257 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.257 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.258 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.258 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.258 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.259 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.259 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.259 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.260 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.260 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.260 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.261 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.261 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.261 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.262 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.262 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.263 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.263 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.263 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.263 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.264 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.264 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.264 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.265 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.265 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.266 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.266 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.266 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.267 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.267 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.267 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.268 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.268 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.268 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.269 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.269 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.269 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.270 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.270 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.270 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.270 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.271 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.271 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.272 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.272 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.272 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.273 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.273 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.273 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.274 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.274 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.275 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.275 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.275 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.275 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.275 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.276 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.276 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.276 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.276 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.276 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.277 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.277 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.277 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.277 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.277 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.278 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.278 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.278 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.278 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.279 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.279 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.279 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.279 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.279 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.280 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.280 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.280 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.280 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.280 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.281 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.281 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.281 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.281 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.281 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.282 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.282 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.282 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.282 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.282 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.283 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.283 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.283 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.283 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.283 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.284 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.284 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.284 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.284 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.285 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.285 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.285 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.285 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.285 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.286 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.286 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.286 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.286 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.286 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.287 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.287 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.287 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.287 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.287 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.288 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.288 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.288 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.289 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.289 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.289 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.289 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.290 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.290 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.290 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.291 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.291 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.291 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.291 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.292 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.292 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.292 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.293 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.293 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.293 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.293 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.294 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.294 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.294 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.294 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.295 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.295 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.295 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.295 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.296 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.296 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.296 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.296 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.297 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.297 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.297 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.298 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.298 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.298 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.298 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.299 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.299 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.299 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.300 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.300 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.300 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.300 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.301 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.301 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.301 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.301 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.302 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.302 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.302 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.302 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.303 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.303 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.303 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.303 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.303 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.304 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.304 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.304 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.304 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.305 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.305 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.305 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.305 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.305 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.306 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.306 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.306 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.306 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.307 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.307 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.307 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.307 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.307 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.308 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.308 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.308 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.308 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.308 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.309 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.309 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.309 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.309 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.309 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.310 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.310 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.310 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.310 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.311 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.311 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.311 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.311 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.311 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.312 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.312 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.312 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.312 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.313 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.313 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.313 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.313 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.314 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.314 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.314 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.315 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.315 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.315 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.315 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.316 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.316 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.316 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.316 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.316 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.316 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.317 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.317 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.317 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.317 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.317 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.317 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.317 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.318 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.318 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.318 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.318 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.318 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.318 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.318 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.319 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.319 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.319 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.319 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.319 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.319 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.320 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.320 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.320 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.320 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.320 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.320 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.321 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.321 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.321 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.321 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.321 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.322 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.322 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.322 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.322 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.322 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.322 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.323 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.323 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.323 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.323 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.323 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.324 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.324 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.324 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.324 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.324 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.325 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.325 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.325 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.325 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.325 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.325 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.326 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.326 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.326 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.326 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.326 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.327 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.327 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.327 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.327 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.327 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.327 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.328 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.328 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.328 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.328 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.328 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.329 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.329 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.329 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.329 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.329 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.330 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.330 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.330 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.330 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.330 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.331 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.331 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.331 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.331 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.331 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.331 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.332 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.332 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.332 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.332 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.333 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.333 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.333 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.333 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.333 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.334 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.334 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.334 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.334 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.334 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.335 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.335 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.335 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.335 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.335 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.335 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.335 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.336 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.336 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.336 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.336 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.336 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.336 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.337 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.337 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.337 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.337 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.337 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.337 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.337 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.338 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.338 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.338 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.338 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.338 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.338 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.338 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.339 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.339 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.339 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.339 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.339 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.339 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.340 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.340 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.340 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.340 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.340 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.340 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.340 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.341 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.341 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.341 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.341 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.341 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.341 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.341 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.342 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.342 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.342 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.342 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.342 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.342 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.343 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.343 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.343 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.343 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.343 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.343 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.343 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.344 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.344 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.344 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.344 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.344 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.344 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.345 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.345 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.345 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.345 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.345 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.345 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.345 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.346 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.346 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.346 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.346 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.346 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.346 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.346 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.347 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.347 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.347 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.347 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.347 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.348 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.348 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.348 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.348 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.348 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.348 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.348 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.349 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.349 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.349 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.349 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.349 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.349 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.350 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.350 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.350 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.350 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.350 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.350 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.350 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.351 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.351 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.351 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.351 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.351 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.351 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.352 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.352 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.352 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.352 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.352 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.352 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.352 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.352 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.353 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.353 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.353 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.353 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.353 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.353 186552 WARNING oslo_config.cfg [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 19:48:11 np0005539279 nova_compute[186546]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 19:48:11 np0005539279 nova_compute[186546]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 19:48:11 np0005539279 nova_compute[186546]: and ``live_migration_inbound_addr`` respectively.
Nov 28 19:48:11 np0005539279 nova_compute[186546]: ).  Its value may be silently ignored in the future.#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.354 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.354 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.354 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.354 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.354 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.354 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.355 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.355 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.355 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.355 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.355 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.355 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.355 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.356 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.356 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.356 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.356 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.356 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.356 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.356 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.357 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.357 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.357 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.357 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.357 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.357 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.357 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.358 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.358 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.358 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.358 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.358 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.358 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.359 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.359 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.359 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.359 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.359 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.359 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.359 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.360 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.360 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.360 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.360 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.360 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.360 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.360 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.361 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.361 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.361 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.361 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.361 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.361 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.362 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.362 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.362 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.362 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.362 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.362 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.362 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.362 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.363 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.363 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.363 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.363 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.363 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.363 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.363 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.364 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.364 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.364 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.364 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.364 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.364 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.364 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.365 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.365 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.365 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.365 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.365 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.365 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.365 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.366 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.366 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.366 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.366 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.366 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.367 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.367 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.367 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.367 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.367 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.367 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.368 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.368 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.368 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.368 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.368 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.368 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.369 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.369 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.369 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.369 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.369 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.369 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.369 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.370 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.370 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.370 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.370 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.370 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.370 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.370 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.371 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.371 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.371 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.371 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.371 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.371 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.371 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.372 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.372 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.372 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.372 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.372 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.372 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.372 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.373 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.373 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.373 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.373 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.373 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.373 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.373 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.374 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.374 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.374 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.374 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.374 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.374 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.375 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.375 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.375 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.375 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.375 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.375 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.375 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.376 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.376 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.376 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.376 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.376 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.376 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.376 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.377 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.377 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.377 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.377 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.377 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.377 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.377 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.378 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.378 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.378 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.378 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.378 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.378 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.378 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.379 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.379 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.379 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.379 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.379 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.379 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.379 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.380 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.380 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.380 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.380 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.380 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.380 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.380 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.381 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.381 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.381 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.381 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.381 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.381 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.381 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.382 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.382 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.382 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.382 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.382 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.382 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.382 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.383 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.383 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.383 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.383 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.383 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.383 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.383 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.384 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.384 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.384 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.384 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.384 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.384 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.384 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.385 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.385 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.385 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.385 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.385 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.385 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.385 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.386 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.386 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.386 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.386 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.386 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.386 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.386 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.387 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.387 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.387 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.387 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.387 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.387 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.387 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.388 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.388 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.388 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.388 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.388 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.388 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.388 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.389 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.389 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.389 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.389 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.389 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.389 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.389 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.390 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.390 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.390 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.390 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.391 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.391 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.391 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.391 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.391 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.391 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.392 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.392 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.392 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.392 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.392 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.392 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.393 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.393 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.393 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.393 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.393 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.393 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.393 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.394 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.394 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.394 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.394 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.394 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.394 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.394 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.395 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.395 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.395 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.395 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.395 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.395 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.395 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.396 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.396 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.396 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.396 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.396 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.396 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.396 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.397 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.397 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.397 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.397 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.397 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.397 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.398 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.398 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.398 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.398 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.398 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.398 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.398 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.399 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.399 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.399 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.399 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.399 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.399 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.399 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.400 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.400 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.400 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.400 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.400 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.400 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.400 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.401 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.401 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.401 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.401 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.401 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.401 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.402 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.402 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.402 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.402 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.402 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.402 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.402 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.403 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.403 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.403 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.403 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.403 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.403 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.403 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.404 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.404 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.404 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.404 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.404 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.404 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.404 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.405 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.405 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.405 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.405 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.405 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.405 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.405 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.406 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.406 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.406 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.406 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.406 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.406 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.406 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.407 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.407 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.407 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.407 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.407 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.407 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.407 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.408 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.408 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.408 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.408 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.408 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.408 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.408 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.408 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.409 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.409 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.409 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.409 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.409 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.409 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.410 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.410 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.410 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.410 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.410 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.410 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.410 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.411 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.411 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.411 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.411 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.411 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.411 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.412 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.412 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.412 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.412 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.412 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.412 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.412 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.413 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.413 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.413 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.413 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.413 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.414 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.414 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.414 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.414 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.414 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.415 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.415 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.415 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.415 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.415 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.415 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.416 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.416 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.416 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.416 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.416 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.416 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.416 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.417 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.417 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.417 186552 DEBUG oslo_service.service [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.418 186552 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.474 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.475 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.475 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.475 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 28 19:48:11 np0005539279 systemd[1]: Starting libvirt QEMU daemon...
Nov 28 19:48:11 np0005539279 systemd[1]: Started libvirt QEMU daemon.
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.556 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f3e02e9cd60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.560 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f3e02e9cd60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.561 186552 INFO nova.virt.libvirt.driver [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.618 186552 WARNING nova.virt.libvirt.driver [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Nov 28 19:48:11 np0005539279 nova_compute[186546]: 2025-11-29 00:48:11.619 186552 DEBUG nova.virt.libvirt.volume.mount [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 28 19:48:12 np0005539279 python3.9[187218]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 19:48:12 np0005539279 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 19:48:12 np0005539279 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.507 186552 INFO nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <host>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <uuid>0b852fc4-ac1f-4ef6-847b-925a46032b4e</uuid>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <arch>x86_64</arch>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model>EPYC-Rome-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <vendor>AMD</vendor>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <microcode version='16777317'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <signature family='23' model='49' stepping='0'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='x2apic'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='tsc-deadline'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='osxsave'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='hypervisor'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='tsc_adjust'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='spec-ctrl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='stibp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='arch-capabilities'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='cmp_legacy'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='topoext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='virt-ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='lbrv'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='tsc-scale'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='vmcb-clean'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='pause-filter'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='pfthreshold'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='svme-addr-chk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='rdctl-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='skip-l1dfl-vmentry'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='mds-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature name='pschange-mc-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <pages unit='KiB' size='4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <pages unit='KiB' size='2048'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <pages unit='KiB' size='1048576'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <power_management>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <suspend_mem/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <suspend_disk/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <suspend_hybrid/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </power_management>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <iommu support='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <migration_features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <live/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <uri_transports>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <uri_transport>tcp</uri_transport>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <uri_transport>rdma</uri_transport>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </uri_transports>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </migration_features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <topology>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <cells num='1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <cell id='0'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:          <memory unit='KiB'>7864320</memory>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:          <pages unit='KiB' size='2048'>0</pages>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:          <distances>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:            <sibling id='0' value='10'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:          </distances>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:          <cpus num='8'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:          </cpus>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        </cell>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </cells>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </topology>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <cache>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </cache>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <secmodel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model>selinux</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <doi>0</doi>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </secmodel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <secmodel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model>dac</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <doi>0</doi>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </secmodel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </host>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <guest>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <os_type>hvm</os_type>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <arch name='i686'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <wordsize>32</wordsize>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <domain type='qemu'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <domain type='kvm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </arch>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <pae/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <nonpae/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <acpi default='on' toggle='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <apic default='on' toggle='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <cpuselection/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <deviceboot/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <disksnapshot default='on' toggle='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <externalSnapshot/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </guest>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <guest>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <os_type>hvm</os_type>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <arch name='x86_64'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <wordsize>64</wordsize>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <domain type='qemu'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <domain type='kvm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </arch>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <acpi default='on' toggle='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <apic default='on' toggle='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <cpuselection/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <deviceboot/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <disksnapshot default='on' toggle='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <externalSnapshot/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </guest>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 
Nov 28 19:48:12 np0005539279 nova_compute[186546]: </capabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.516 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.547 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 19:48:12 np0005539279 nova_compute[186546]: <domainCapabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <domain>kvm</domain>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <arch>i686</arch>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <vcpu max='4096'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <iothreads supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <os supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <enum name='firmware'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <loader supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>rom</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pflash</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='readonly'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>yes</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>no</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='secure'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>no</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </loader>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </os>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='host-passthrough' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='hostPassthroughMigratable'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>on</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>off</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='maximum' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='maximumMigratable'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>on</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>off</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='host-model' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <vendor>AMD</vendor>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='x2apic'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='hypervisor'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='stibp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='overflow-recov'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='succor'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='lbrv'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc-scale'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='flushbyasid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='pause-filter'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='pfthreshold'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='disable' name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='custom' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Dhyana-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Genoa'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='auto-ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='auto-ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-128'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-256'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-512'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v6'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v7'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='KnightsMill'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512er'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512pf'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='KnightsMill-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512er'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512pf'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G4-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tbm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G5-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tbm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SierraForest'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cmpccxadd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SierraForest-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cmpccxadd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='athlon'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='athlon-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='core2duo'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='core2duo-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='coreduo'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='coreduo-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='n270'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='n270-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='phenom'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='phenom-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <memoryBacking supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <enum name='sourceType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>file</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>anonymous</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>memfd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </memoryBacking>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <devices>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <disk supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='diskDevice'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>disk</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>cdrom</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>floppy</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>lun</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='bus'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>fdc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>scsi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>sata</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-non-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </disk>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <graphics supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vnc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>egl-headless</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dbus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </graphics>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <video supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='modelType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vga</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>cirrus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>none</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>bochs</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ramfb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </video>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <hostdev supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='mode'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>subsystem</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='startupPolicy'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>default</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>mandatory</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>requisite</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>optional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='subsysType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pci</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>scsi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='capsType'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='pciBackend'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </hostdev>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <rng supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-non-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>random</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>egd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>builtin</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </rng>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <filesystem supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='driverType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>path</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>handle</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtiofs</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </filesystem>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <tpm supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tpm-tis</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tpm-crb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>emulator</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>external</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendVersion'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>2.0</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </tpm>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <redirdev supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='bus'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </redirdev>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <channel supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pty</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>unix</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </channel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <crypto supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>qemu</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>builtin</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </crypto>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <interface supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>default</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>passt</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </interface>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <panic supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>isa</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>hyperv</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </panic>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <console supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>null</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pty</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dev</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>file</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pipe</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>stdio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>udp</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tcp</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>unix</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>qemu-vdagent</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dbus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </console>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </devices>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <gic supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <vmcoreinfo supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <genid supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <backingStoreInput supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <backup supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <async-teardown supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <ps2 supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <sev supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <sgx supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <hyperv supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='features'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>relaxed</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vapic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>spinlocks</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vpindex</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>runtime</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>synic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>stimer</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>reset</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vendor_id</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>frequencies</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>reenlightenment</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tlbflush</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ipi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>avic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>emsr_bitmap</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>xmm_input</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <defaults>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <spinlocks>4095</spinlocks>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <stimer_direct>on</stimer_direct>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </defaults>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </hyperv>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <launchSecurity supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='sectype'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tdx</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </launchSecurity>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: </domainCapabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.558 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 19:48:12 np0005539279 nova_compute[186546]: <domainCapabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <domain>kvm</domain>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <arch>i686</arch>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <vcpu max='240'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <iothreads supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <os supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <enum name='firmware'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <loader supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>rom</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pflash</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='readonly'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>yes</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>no</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='secure'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>no</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </loader>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </os>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='host-passthrough' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='hostPassthroughMigratable'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>on</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>off</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='maximum' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='maximumMigratable'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>on</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>off</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='host-model' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <vendor>AMD</vendor>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='x2apic'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='hypervisor'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='stibp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='overflow-recov'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='succor'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='lbrv'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc-scale'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='flushbyasid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='pause-filter'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='pfthreshold'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='disable' name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='custom' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Dhyana-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Genoa'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='auto-ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='auto-ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-128'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-256'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-512'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v6'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v7'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='KnightsMill'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512er'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512pf'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='KnightsMill-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512er'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512pf'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G4-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tbm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G5-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tbm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SierraForest'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cmpccxadd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SierraForest-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cmpccxadd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='athlon'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='athlon-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='core2duo'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='core2duo-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='coreduo'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='coreduo-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='n270'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='n270-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='phenom'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='phenom-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <memoryBacking supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <enum name='sourceType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>file</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>anonymous</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>memfd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </memoryBacking>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <devices>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <disk supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='diskDevice'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>disk</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>cdrom</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>floppy</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>lun</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='bus'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ide</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>fdc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>scsi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>sata</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-non-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </disk>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <graphics supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vnc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>egl-headless</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dbus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </graphics>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <video supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='modelType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vga</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>cirrus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>none</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>bochs</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ramfb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </video>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <hostdev supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='mode'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>subsystem</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='startupPolicy'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>default</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>mandatory</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>requisite</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>optional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='subsysType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pci</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>scsi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='capsType'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='pciBackend'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </hostdev>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <rng supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-non-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>random</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>egd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>builtin</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </rng>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <filesystem supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='driverType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>path</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>handle</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtiofs</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </filesystem>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <tpm supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tpm-tis</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tpm-crb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>emulator</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>external</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendVersion'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>2.0</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </tpm>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <redirdev supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='bus'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </redirdev>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <channel supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pty</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>unix</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </channel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <crypto supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>qemu</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>builtin</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </crypto>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <interface supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>default</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>passt</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </interface>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <panic supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>isa</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>hyperv</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </panic>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <console supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>null</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pty</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dev</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>file</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pipe</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>stdio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>udp</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tcp</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>unix</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>qemu-vdagent</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dbus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </console>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </devices>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <gic supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <vmcoreinfo supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <genid supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <backingStoreInput supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <backup supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <async-teardown supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <ps2 supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <sev supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <sgx supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <hyperv supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='features'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>relaxed</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vapic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>spinlocks</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vpindex</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>runtime</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>synic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>stimer</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>reset</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vendor_id</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>frequencies</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>reenlightenment</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tlbflush</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ipi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>avic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>emsr_bitmap</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>xmm_input</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <defaults>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <spinlocks>4095</spinlocks>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <stimer_direct>on</stimer_direct>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </defaults>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </hyperv>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <launchSecurity supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='sectype'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tdx</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </launchSecurity>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: </domainCapabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.611 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.617 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 19:48:12 np0005539279 nova_compute[186546]: <domainCapabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <domain>kvm</domain>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <arch>x86_64</arch>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <vcpu max='4096'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <iothreads supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <os supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <enum name='firmware'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>efi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <loader supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>rom</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pflash</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='readonly'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>yes</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>no</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='secure'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>yes</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>no</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </loader>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </os>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='host-passthrough' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='hostPassthroughMigratable'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>on</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>off</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='maximum' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='maximumMigratable'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>on</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>off</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='host-model' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <vendor>AMD</vendor>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='x2apic'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='hypervisor'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='stibp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='overflow-recov'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='succor'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='lbrv'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc-scale'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='flushbyasid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='pause-filter'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='pfthreshold'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='disable' name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='custom' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Dhyana-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Genoa'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='auto-ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='auto-ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-128'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-256'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-512'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v6'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v7'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='KnightsMill'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512er'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512pf'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='KnightsMill-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512er'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512pf'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G4-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tbm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G5-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tbm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SierraForest'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cmpccxadd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SierraForest-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cmpccxadd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='athlon'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='athlon-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='core2duo'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='core2duo-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='coreduo'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='coreduo-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='n270'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='n270-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='phenom'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='phenom-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <memoryBacking supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <enum name='sourceType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>file</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>anonymous</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>memfd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </memoryBacking>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <devices>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <disk supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='diskDevice'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>disk</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>cdrom</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>floppy</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>lun</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='bus'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>fdc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>scsi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>sata</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-non-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </disk>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <graphics supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vnc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>egl-headless</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dbus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </graphics>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <video supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='modelType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vga</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>cirrus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>none</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>bochs</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ramfb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </video>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <hostdev supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='mode'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>subsystem</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='startupPolicy'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>default</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>mandatory</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>requisite</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>optional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='subsysType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pci</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>scsi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='capsType'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='pciBackend'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </hostdev>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <rng supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-non-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>random</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>egd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>builtin</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </rng>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <filesystem supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='driverType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>path</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>handle</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtiofs</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </filesystem>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <tpm supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tpm-tis</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tpm-crb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>emulator</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>external</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendVersion'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>2.0</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </tpm>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <redirdev supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='bus'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </redirdev>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <channel supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pty</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>unix</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </channel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <crypto supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>qemu</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>builtin</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </crypto>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <interface supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>default</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>passt</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </interface>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <panic supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>isa</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>hyperv</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </panic>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <console supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>null</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pty</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dev</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>file</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pipe</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>stdio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>udp</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tcp</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>unix</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>qemu-vdagent</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dbus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </console>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </devices>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <gic supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <vmcoreinfo supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <genid supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <backingStoreInput supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <backup supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <async-teardown supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <ps2 supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <sev supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <sgx supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <hyperv supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='features'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>relaxed</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vapic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>spinlocks</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vpindex</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>runtime</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>synic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>stimer</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>reset</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vendor_id</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>frequencies</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>reenlightenment</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tlbflush</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ipi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>avic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>emsr_bitmap</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>xmm_input</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <defaults>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <spinlocks>4095</spinlocks>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <stimer_direct>on</stimer_direct>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </defaults>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </hyperv>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <launchSecurity supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='sectype'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tdx</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </launchSecurity>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: </domainCapabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.687 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 19:48:12 np0005539279 nova_compute[186546]: <domainCapabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <domain>kvm</domain>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <arch>x86_64</arch>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <vcpu max='240'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <iothreads supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <os supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <enum name='firmware'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <loader supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>rom</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pflash</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='readonly'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>yes</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>no</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='secure'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>no</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </loader>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </os>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='host-passthrough' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='hostPassthroughMigratable'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>on</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>off</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='maximum' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='maximumMigratable'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>on</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>off</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='host-model' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <vendor>AMD</vendor>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='x2apic'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='hypervisor'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='stibp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='overflow-recov'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='succor'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='lbrv'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='tsc-scale'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='flushbyasid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='pause-filter'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='pfthreshold'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <feature policy='disable' name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <mode name='custom' supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Broadwell-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Cooperlake-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Denverton-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Dhyana-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Genoa'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='auto-ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='auto-ibrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Milan-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amd-psfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='stibp-always-on'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-Rome-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='EPYC-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='GraniteRapids-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-128'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-256'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx10-512'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='prefetchiti'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Haswell-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v6'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Icelake-Server-v7'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='IvyBridge-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='KnightsMill'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512er'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512pf'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='KnightsMill-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512er'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512pf'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G4-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tbm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Opteron_G5-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fma4'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tbm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xop'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SapphireRapids-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='amx-tile'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-bf16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-fp16'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bitalg'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrc'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fzrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='la57'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='taa-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xfd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SierraForest'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cmpccxadd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='SierraForest-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ifma'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cmpccxadd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fbsdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='fsrs'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ibrs-all'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mcdt-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pbrsb-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='psdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='serialize'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vaes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Client-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='hle'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='rtm'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Skylake-Server-v5'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512bw'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512cd'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512dq'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512f'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='avx512vl'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='invpcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pcid'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='pku'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='mpx'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v2'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v3'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='core-capability'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='split-lock-detect'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='Snowridge-v4'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='cldemote'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='erms'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='gfni'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdir64b'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='movdiri'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='xsaves'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='athlon'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='athlon-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='core2duo'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='core2duo-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='coreduo'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='coreduo-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='n270'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='n270-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='ss'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='phenom'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <blockers model='phenom-v1'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnow'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <feature name='3dnowext'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </blockers>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </mode>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </cpu>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <memoryBacking supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <enum name='sourceType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>file</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>anonymous</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <value>memfd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </memoryBacking>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <devices>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <disk supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='diskDevice'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>disk</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>cdrom</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>floppy</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>lun</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='bus'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ide</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>fdc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>scsi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>sata</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-non-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </disk>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <graphics supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vnc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>egl-headless</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dbus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </graphics>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <video supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='modelType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vga</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>cirrus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>none</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>bochs</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ramfb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </video>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <hostdev supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='mode'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>subsystem</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='startupPolicy'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>default</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>mandatory</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>requisite</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>optional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='subsysType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pci</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>scsi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='capsType'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='pciBackend'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </hostdev>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <rng supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtio-non-transitional</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>random</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>egd</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>builtin</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </rng>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <filesystem supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='driverType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>path</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>handle</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>virtiofs</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </filesystem>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <tpm supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tpm-tis</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tpm-crb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>emulator</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>external</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendVersion'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>2.0</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </tpm>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <redirdev supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='bus'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>usb</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </redirdev>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <channel supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pty</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>unix</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </channel>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <crypto supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>qemu</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendModel'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>builtin</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </crypto>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <interface supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='backendType'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>default</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>passt</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </interface>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <panic supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='model'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>isa</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>hyperv</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </panic>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <console supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='type'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>null</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vc</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pty</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dev</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>file</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>pipe</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>stdio</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>udp</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tcp</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>unix</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>qemu-vdagent</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>dbus</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </console>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </devices>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  <features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <gic supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <vmcoreinfo supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <genid supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <backingStoreInput supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <backup supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <async-teardown supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <ps2 supported='yes'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <sev supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <sgx supported='no'/>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <hyperv supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='features'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>relaxed</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vapic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>spinlocks</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vpindex</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>runtime</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>synic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>stimer</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>reset</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>vendor_id</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>frequencies</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>reenlightenment</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tlbflush</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>ipi</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>avic</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>emsr_bitmap</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>xmm_input</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <defaults>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <spinlocks>4095</spinlocks>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <stimer_direct>on</stimer_direct>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </defaults>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </hyperv>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    <launchSecurity supported='yes'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      <enum name='sectype'>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:        <value>tdx</value>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:      </enum>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:    </launchSecurity>
Nov 28 19:48:12 np0005539279 nova_compute[186546]:  </features>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: </domainCapabilities>
Nov 28 19:48:12 np0005539279 nova_compute[186546]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.749 186552 DEBUG nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.749 186552 INFO nova.virt.libvirt.host [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Secure Boot support detected#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.752 186552 INFO nova.virt.libvirt.driver [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.752 186552 INFO nova.virt.libvirt.driver [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.765 186552 DEBUG nova.virt.libvirt.driver [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.814 186552 INFO nova.virt.node [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Determined node identity 15673c9a-eee0-47b4-b3d3-728a0fedb147 from /var/lib/nova/compute_id#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.892 186552 WARNING nova.compute.manager [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Compute nodes ['15673c9a-eee0-47b4-b3d3-728a0fedb147'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Nov 28 19:48:12 np0005539279 podman[187290]: 2025-11-29 00:48:12.90949112 +0000 UTC m=+0.135730191 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.944 186552 INFO nova.compute.manager [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.989 186552 WARNING nova.compute.manager [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.989 186552 DEBUG oslo_concurrency.lockutils [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.989 186552 DEBUG oslo_concurrency.lockutils [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.990 186552 DEBUG oslo_concurrency.lockutils [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:48:12 np0005539279 nova_compute[186546]: 2025-11-29 00:48:12.990 186552 DEBUG nova.compute.resource_tracker [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:48:13 np0005539279 systemd[1]: Starting libvirt nodedev daemon...
Nov 28 19:48:13 np0005539279 systemd[1]: Started libvirt nodedev daemon.
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.314 186552 WARNING nova.virt.libvirt.driver [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.315 186552 DEBUG nova.compute.resource_tracker [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6188MB free_disk=73.54586029052734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.315 186552 DEBUG oslo_concurrency.lockutils [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.315 186552 DEBUG oslo_concurrency.lockutils [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.338 186552 WARNING nova.compute.resource_tracker [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] No compute node record for compute-0.ctlplane.example.com:15673c9a-eee0-47b4-b3d3-728a0fedb147: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 15673c9a-eee0-47b4-b3d3-728a0fedb147 could not be found.#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.364 186552 INFO nova.compute.resource_tracker [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 15673c9a-eee0-47b4-b3d3-728a0fedb147#033[00m
Nov 28 19:48:13 np0005539279 python3.9[187451]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:48:13 np0005539279 systemd[1]: Stopping nova_compute container...
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.605 186552 DEBUG nova.compute.resource_tracker [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.606 186552 DEBUG nova.compute.resource_tracker [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.619 186552 DEBUG oslo_concurrency.lockutils [None req-34cb1cc9-d628-41f2-8d14-3f5226260958 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.619 186552 DEBUG oslo_concurrency.lockutils [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.620 186552 DEBUG oslo_concurrency.lockutils [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:48:13 np0005539279 nova_compute[186546]: 2025-11-29 00:48:13.620 186552 DEBUG oslo_concurrency.lockutils [None req-908248ec-1d9b-4e1b-bffc-4a441fad47c2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:48:13 np0005539279 virtqemud[187089]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 28 19:48:13 np0005539279 virtqemud[187089]: hostname: compute-0
Nov 28 19:48:13 np0005539279 virtqemud[187089]: End of file while reading data: Input/output error
Nov 28 19:48:13 np0005539279 systemd[1]: libpod-f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587.scope: Deactivated successfully.
Nov 28 19:48:13 np0005539279 systemd[1]: libpod-f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587.scope: Consumed 3.199s CPU time.
Nov 28 19:48:13 np0005539279 podman[187457]: 2025-11-29 00:48:13.986509203 +0000 UTC m=+0.431378164 container died f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 28 19:48:14 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587-userdata-shm.mount: Deactivated successfully.
Nov 28 19:48:14 np0005539279 systemd[1]: var-lib-containers-storage-overlay-c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf-merged.mount: Deactivated successfully.
Nov 28 19:48:14 np0005539279 podman[187457]: 2025-11-29 00:48:14.042140263 +0000 UTC m=+0.487009224 container cleanup f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:48:14 np0005539279 podman[187457]: nova_compute
Nov 28 19:48:14 np0005539279 podman[187487]: nova_compute
Nov 28 19:48:14 np0005539279 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 28 19:48:14 np0005539279 systemd[1]: Stopped nova_compute container.
Nov 28 19:48:14 np0005539279 systemd[1]: Starting nova_compute container...
Nov 28 19:48:14 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:48:14 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:14 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:14 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:14 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:14 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1eaa0af78b4a96cacb04ea974acf170cc308a473b6c0aa382a241af616cc2bf/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:14 np0005539279 podman[187500]: 2025-11-29 00:48:14.296943148 +0000 UTC m=+0.124101443 container init f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:48:14 np0005539279 podman[187500]: 2025-11-29 00:48:14.308604387 +0000 UTC m=+0.135762652 container start f98580fe64a6998b4af45cb6c089f7fbb3e18260a4851a1c7baeb65623df4587 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 19:48:14 np0005539279 podman[187500]: nova_compute
Nov 28 19:48:14 np0005539279 nova_compute[187514]: + sudo -E kolla_set_configs
Nov 28 19:48:14 np0005539279 systemd[1]: Started nova_compute container.
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Validating config file
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Copying service configuration files
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Deleting /etc/ceph
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Creating directory /etc/ceph
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Writing out command to execute
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 19:48:14 np0005539279 nova_compute[187514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 19:48:14 np0005539279 nova_compute[187514]: ++ cat /run_command
Nov 28 19:48:14 np0005539279 nova_compute[187514]: + CMD=nova-compute
Nov 28 19:48:14 np0005539279 nova_compute[187514]: + ARGS=
Nov 28 19:48:14 np0005539279 nova_compute[187514]: + sudo kolla_copy_cacerts
Nov 28 19:48:14 np0005539279 nova_compute[187514]: + [[ ! -n '' ]]
Nov 28 19:48:14 np0005539279 nova_compute[187514]: + . kolla_extend_start
Nov 28 19:48:14 np0005539279 nova_compute[187514]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 19:48:14 np0005539279 nova_compute[187514]: Running command: 'nova-compute'
Nov 28 19:48:14 np0005539279 nova_compute[187514]: + umask 0022
Nov 28 19:48:14 np0005539279 nova_compute[187514]: + exec nova-compute
Nov 28 19:48:14 np0005539279 podman[187597]: 2025-11-29 00:48:14.850795139 +0000 UTC m=+0.091451351 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 19:48:15 np0005539279 python3.9[187693]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 19:48:15 np0005539279 systemd[1]: Started libpod-conmon-eaa296df7902d34e507460d7b521555a099d888fcae1a2667e1a40364fab9994.scope.
Nov 28 19:48:15 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:48:15 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11b30e94b15ee454576a64b3a9efe4c6e6f430dce21e9375a6dbfe8cbc372e6/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:15 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11b30e94b15ee454576a64b3a9efe4c6e6f430dce21e9375a6dbfe8cbc372e6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:15 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d11b30e94b15ee454576a64b3a9efe4c6e6f430dce21e9375a6dbfe8cbc372e6/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 28 19:48:15 np0005539279 podman[187719]: 2025-11-29 00:48:15.674142747 +0000 UTC m=+0.194689783 container init eaa296df7902d34e507460d7b521555a099d888fcae1a2667e1a40364fab9994 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 19:48:15 np0005539279 podman[187719]: 2025-11-29 00:48:15.686352281 +0000 UTC m=+0.206899267 container start eaa296df7902d34e507460d7b521555a099d888fcae1a2667e1a40364fab9994 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 19:48:15 np0005539279 python3.9[187693]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Applying nova statedir ownership
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 28 19:48:15 np0005539279 nova_compute_init[187741]: INFO:nova_statedir:Nova statedir ownership complete
Nov 28 19:48:15 np0005539279 systemd[1]: libpod-eaa296df7902d34e507460d7b521555a099d888fcae1a2667e1a40364fab9994.scope: Deactivated successfully.
Nov 28 19:48:15 np0005539279 podman[187755]: 2025-11-29 00:48:15.841100131 +0000 UTC m=+0.048133267 container died eaa296df7902d34e507460d7b521555a099d888fcae1a2667e1a40364fab9994 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 28 19:48:15 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eaa296df7902d34e507460d7b521555a099d888fcae1a2667e1a40364fab9994-userdata-shm.mount: Deactivated successfully.
Nov 28 19:48:15 np0005539279 systemd[1]: var-lib-containers-storage-overlay-d11b30e94b15ee454576a64b3a9efe4c6e6f430dce21e9375a6dbfe8cbc372e6-merged.mount: Deactivated successfully.
Nov 28 19:48:15 np0005539279 podman[187755]: 2025-11-29 00:48:15.876017996 +0000 UTC m=+0.083051132 container cleanup eaa296df7902d34e507460d7b521555a099d888fcae1a2667e1a40364fab9994 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 28 19:48:15 np0005539279 systemd[1]: libpod-conmon-eaa296df7902d34e507460d7b521555a099d888fcae1a2667e1a40364fab9994.scope: Deactivated successfully.
Nov 28 19:48:16 np0005539279 systemd-logind[811]: Session 24 logged out. Waiting for processes to exit.
Nov 28 19:48:16 np0005539279 systemd[1]: session-24.scope: Deactivated successfully.
Nov 28 19:48:16 np0005539279 systemd[1]: session-24.scope: Consumed 2min 16.795s CPU time.
Nov 28 19:48:16 np0005539279 systemd-logind[811]: Removed session 24.
Nov 28 19:48:16 np0005539279 nova_compute[187514]: 2025-11-29 00:48:16.438 187518 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 19:48:16 np0005539279 nova_compute[187514]: 2025-11-29 00:48:16.438 187518 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 19:48:16 np0005539279 nova_compute[187514]: 2025-11-29 00:48:16.439 187518 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 19:48:16 np0005539279 nova_compute[187514]: 2025-11-29 00:48:16.439 187518 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 28 19:48:16 np0005539279 nova_compute[187514]: 2025-11-29 00:48:16.570 187518 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:48:16 np0005539279 nova_compute[187514]: 2025-11-29 00:48:16.599 187518 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:48:16 np0005539279 nova_compute[187514]: 2025-11-29 00:48:16.599 187518 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.091 187518 INFO nova.virt.driver [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.216 187518 INFO nova.compute.provider_config [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.235 187518 DEBUG oslo_concurrency.lockutils [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.235 187518 DEBUG oslo_concurrency.lockutils [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.236 187518 DEBUG oslo_concurrency.lockutils [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.237 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.237 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.237 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.237 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.238 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.238 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.238 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.238 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.239 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.239 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.239 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.239 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.240 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.240 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.240 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.241 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.241 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.241 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.241 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.242 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.242 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.242 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.242 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.243 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.243 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.243 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.243 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.243 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.244 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.244 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.244 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.244 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.244 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.245 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.245 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.245 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.245 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.245 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.246 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.246 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.246 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.246 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.247 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.247 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.247 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.247 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.247 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.248 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.248 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.248 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.248 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.248 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.249 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.249 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.249 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.249 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.249 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.250 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.250 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.250 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.250 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.250 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.250 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.251 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.251 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.251 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.251 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.251 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.252 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.252 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.252 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.252 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.253 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.253 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.253 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.253 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.254 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.254 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.254 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.254 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.255 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.255 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.255 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.255 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.256 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.256 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.256 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.256 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.257 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.257 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.257 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.257 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.257 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.258 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.258 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.258 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.258 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.258 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.259 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.259 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.259 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.259 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.260 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.260 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.260 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.260 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.260 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.261 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.261 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.261 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.261 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.261 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.261 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.262 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.262 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.262 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.262 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.262 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.262 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.263 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.263 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.263 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.263 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.263 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.263 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.263 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.264 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.264 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.264 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.264 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.264 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.264 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.264 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.265 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.265 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.265 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.265 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.265 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.265 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.265 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.265 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.266 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.266 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.266 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.266 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.266 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.266 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.266 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.267 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.267 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.267 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.267 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.267 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.267 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.267 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.268 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.268 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.268 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.268 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.268 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.268 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.268 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.269 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.269 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.269 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.269 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.269 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.269 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.270 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.270 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.270 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.270 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.270 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.270 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.270 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.271 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.271 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.271 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.271 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.271 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.271 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.271 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.272 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.272 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.272 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.272 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.272 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.272 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.272 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.272 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.273 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.273 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.273 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.273 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.273 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.273 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.273 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.274 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.274 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.274 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.274 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.274 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.274 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.275 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.275 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.275 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.275 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.275 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.275 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.275 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.275 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.276 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.276 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.276 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.276 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.276 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.276 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.276 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.277 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.277 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.277 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.277 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.277 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.277 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.278 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.278 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.278 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.278 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.278 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.279 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.279 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.279 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.279 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.279 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.279 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.280 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.280 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.280 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.280 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.280 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.281 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.281 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.281 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.281 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.281 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.281 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.282 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.282 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.282 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.282 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.282 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.282 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.283 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.283 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.283 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.283 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.283 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.283 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.283 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.284 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.284 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.284 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.284 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.284 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.284 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.284 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.285 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.285 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.285 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.285 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.285 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.285 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.286 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.286 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.286 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.286 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.286 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.286 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.286 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.287 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.287 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.287 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.287 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.287 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.288 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.288 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.288 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.288 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.288 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.288 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.288 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.289 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.289 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.289 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.289 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.289 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.289 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.289 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.290 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.290 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.290 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.290 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.290 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.291 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.291 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.291 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.291 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.291 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.291 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.292 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.292 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.292 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.292 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.292 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.292 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.292 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.293 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.293 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.293 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.293 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.293 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.294 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.294 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.294 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.294 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.294 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.294 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.294 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.295 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.295 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.295 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.295 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.295 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.295 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.295 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.296 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.296 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.296 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.296 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.296 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.296 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.297 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.297 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.297 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.297 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.297 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.297 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.297 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.298 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.298 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.298 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.298 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.298 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.298 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.298 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.299 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.299 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.299 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.299 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.299 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.300 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.300 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.300 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.300 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.300 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.300 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.301 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.301 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.301 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.301 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.301 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.302 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.302 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.302 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.302 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.302 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.303 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.304 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.304 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.304 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.305 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.305 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.306 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.306 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.306 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.307 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.307 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.307 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.307 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.307 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.308 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.308 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.308 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.308 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.308 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.308 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.308 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.309 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.309 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.309 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.309 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.309 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.309 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.309 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.310 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.310 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.310 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.310 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.310 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.310 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.310 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.311 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.311 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.311 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.311 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.311 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.311 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.311 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.312 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.312 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.312 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.312 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.312 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.312 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.312 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.313 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.313 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.313 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.313 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.313 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.313 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.313 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.313 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.314 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.314 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.314 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.314 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.314 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.314 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.314 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.315 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.315 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.315 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.315 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.315 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.315 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.315 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.316 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.316 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.316 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.316 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.316 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.316 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.316 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.317 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.317 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.317 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.317 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.317 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.317 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.318 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.318 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.318 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.318 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.318 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.318 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.319 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.319 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.319 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.319 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.319 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.319 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.320 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.320 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.320 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.320 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.320 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.320 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.320 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.321 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.321 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.321 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.321 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.321 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.321 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.321 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.321 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.322 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.322 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.322 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.322 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.322 187518 WARNING oslo_config.cfg [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 19:48:17 np0005539279 nova_compute[187514]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 19:48:17 np0005539279 nova_compute[187514]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 19:48:17 np0005539279 nova_compute[187514]: and ``live_migration_inbound_addr`` respectively.
Nov 28 19:48:17 np0005539279 nova_compute[187514]: ).  Its value may be silently ignored in the future.#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.323 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.323 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.323 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.323 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.323 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.323 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.324 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.324 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.324 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.324 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.324 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.324 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.325 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.325 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.325 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.325 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.325 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.325 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.325 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.326 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.326 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.326 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.326 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.326 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.326 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.326 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.327 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.327 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.327 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.327 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.327 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.327 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.327 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.328 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.328 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.328 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.328 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.328 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.328 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.328 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.329 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.329 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.329 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.329 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.329 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.329 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.329 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.330 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.330 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.330 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.330 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.330 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.330 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.330 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.331 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.331 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.331 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.331 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.331 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.331 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.331 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.332 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.332 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.332 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.332 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.332 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.332 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.332 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.333 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.333 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.333 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.333 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.333 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.333 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.333 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.334 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.334 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.334 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.334 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.334 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.334 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.334 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.335 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.335 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.335 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.335 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.335 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.335 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.336 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.336 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.336 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.336 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.336 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.336 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.336 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.337 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.337 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.337 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.337 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.337 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.337 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.337 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.338 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.338 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.338 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.338 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.338 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.338 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.338 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.339 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.339 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.339 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.339 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.339 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.339 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.340 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.340 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.340 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.340 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.340 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.340 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.340 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.340 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.341 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.341 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.341 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.341 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.341 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.341 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.341 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.342 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.342 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.342 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.342 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.342 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.342 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.342 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.343 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.343 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.343 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.343 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.343 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.343 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.344 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.344 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.344 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.344 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.344 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.344 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.344 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.345 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.345 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.345 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.345 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.345 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.345 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.345 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.346 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.346 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.346 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.346 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.346 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.346 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.346 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.347 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.347 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.347 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.347 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.347 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.347 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.347 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.348 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.348 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.348 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.348 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.348 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.348 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.348 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.349 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.349 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.349 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.349 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.349 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.349 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.350 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.350 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.350 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.350 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.350 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.350 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.350 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.351 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.351 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.351 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.351 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.351 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.351 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.351 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.352 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.352 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.352 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.352 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.352 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.352 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.352 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.353 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.353 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.353 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.353 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.353 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.353 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.353 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.354 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.354 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.354 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.354 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.354 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.354 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.354 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.355 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.355 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.355 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.355 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.355 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.355 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.355 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.356 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.356 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.356 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.356 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.356 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.356 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.356 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.357 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.357 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.357 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.357 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.357 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.357 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.357 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.358 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.358 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.358 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.358 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.358 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.358 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.358 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.359 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.359 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.359 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.359 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.359 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.359 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.360 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.360 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.360 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.360 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.360 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.360 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.360 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.361 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.361 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.361 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.361 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.361 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.361 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.362 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.362 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.362 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.362 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.362 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.362 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.363 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.363 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.363 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.363 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.363 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.363 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.363 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.364 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.364 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.364 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.364 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.364 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.364 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.365 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.365 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.365 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.365 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.365 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.366 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.366 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.366 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.366 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.366 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.367 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.367 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.367 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.367 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.367 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.368 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.368 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.368 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.368 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.368 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.369 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.369 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.369 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.369 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.369 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.369 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.370 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.370 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.370 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.370 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.370 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.370 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.370 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.371 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.371 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.371 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.371 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.371 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.371 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.372 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.372 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.372 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.372 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.372 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.372 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.372 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.372 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.373 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.373 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.373 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.373 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.373 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.373 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.374 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.374 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.374 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.374 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.374 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.374 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.374 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.375 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.375 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.375 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.375 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.375 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.375 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.375 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.375 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.376 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.376 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.376 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.376 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.376 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.376 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.376 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.377 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.377 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.377 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.377 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.377 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.377 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.377 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.378 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.378 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.378 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.378 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.378 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.378 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.378 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.378 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.379 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.379 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.379 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.379 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.379 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.379 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.379 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.380 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.380 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.380 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.380 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.380 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.380 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.380 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.381 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.381 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.381 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.381 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.381 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.381 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.381 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.382 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.382 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.382 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.382 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.382 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.382 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.382 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.383 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.383 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.383 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.383 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.383 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.383 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.383 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.383 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.384 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.384 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.384 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.384 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.384 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.384 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.384 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.385 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.385 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.385 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.385 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.385 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.385 187518 DEBUG oslo_service.service [None req-ae87fd41-0377-4e02-8d8f-6af08c50df2b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.386 187518 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.406 187518 INFO nova.virt.node [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Determined node identity 15673c9a-eee0-47b4-b3d3-728a0fedb147 from /var/lib/nova/compute_id#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.407 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.407 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.407 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.408 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.425 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fdb3e24e250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.428 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fdb3e24e250> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.429 187518 INFO nova.virt.libvirt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.442 187518 INFO nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <host>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <uuid>0b852fc4-ac1f-4ef6-847b-925a46032b4e</uuid>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <arch>x86_64</arch>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model>EPYC-Rome-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <vendor>AMD</vendor>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <microcode version='16777317'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <signature family='23' model='49' stepping='0'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='x2apic'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='tsc-deadline'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='osxsave'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='hypervisor'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='tsc_adjust'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='spec-ctrl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='stibp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='arch-capabilities'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='cmp_legacy'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='topoext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='virt-ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='lbrv'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='tsc-scale'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='vmcb-clean'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='pause-filter'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='pfthreshold'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='svme-addr-chk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='rdctl-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='skip-l1dfl-vmentry'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='mds-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature name='pschange-mc-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <pages unit='KiB' size='4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <pages unit='KiB' size='2048'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <pages unit='KiB' size='1048576'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <power_management>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <suspend_mem/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <suspend_disk/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <suspend_hybrid/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </power_management>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <iommu support='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <migration_features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <live/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <uri_transports>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <uri_transport>tcp</uri_transport>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <uri_transport>rdma</uri_transport>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </uri_transports>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </migration_features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <topology>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <cells num='1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <cell id='0'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:          <memory unit='KiB'>7864320</memory>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:          <pages unit='KiB' size='4'>1966080</pages>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:          <pages unit='KiB' size='2048'>0</pages>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:          <distances>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:            <sibling id='0' value='10'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:          </distances>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:          <cpus num='8'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:          </cpus>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        </cell>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </cells>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </topology>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <cache>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </cache>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <secmodel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model>selinux</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <doi>0</doi>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </secmodel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <secmodel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model>dac</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <doi>0</doi>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </secmodel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </host>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <guest>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <os_type>hvm</os_type>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <arch name='i686'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <wordsize>32</wordsize>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <domain type='qemu'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <domain type='kvm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </arch>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <pae/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <nonpae/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <acpi default='on' toggle='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <apic default='on' toggle='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <cpuselection/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <deviceboot/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <disksnapshot default='on' toggle='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <externalSnapshot/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </guest>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <guest>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <os_type>hvm</os_type>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <arch name='x86_64'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <wordsize>64</wordsize>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <domain type='qemu'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <domain type='kvm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </arch>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <acpi default='on' toggle='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <apic default='on' toggle='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <cpuselection/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <deviceboot/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <disksnapshot default='on' toggle='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <externalSnapshot/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </guest>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 
Nov 28 19:48:17 np0005539279 nova_compute[187514]: </capabilities>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: #033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.448 187518 DEBUG nova.virt.libvirt.volume.mount [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.450 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.457 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 19:48:17 np0005539279 nova_compute[187514]: <domainCapabilities>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <domain>kvm</domain>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <arch>i686</arch>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <vcpu max='4096'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <iothreads supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <os supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <enum name='firmware'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <loader supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>rom</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pflash</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='readonly'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>yes</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>no</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='secure'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>no</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </loader>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='host-passthrough' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='hostPassthroughMigratable'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>on</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>off</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='maximum' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='maximumMigratable'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>on</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>off</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='host-model' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <vendor>AMD</vendor>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='x2apic'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='hypervisor'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='stibp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='overflow-recov'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='succor'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='lbrv'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc-scale'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='flushbyasid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='pause-filter'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='pfthreshold'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='disable' name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='custom' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Dhyana-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Genoa'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='auto-ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='auto-ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-128'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-256'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-512'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v6'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v7'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='KnightsMill'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512er'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512pf'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='KnightsMill-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512er'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512pf'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G4-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tbm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G5-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tbm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SierraForest'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cmpccxadd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SierraForest-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cmpccxadd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='athlon'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='athlon-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='core2duo'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='core2duo-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='coreduo'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='coreduo-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='n270'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='n270-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='phenom'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='phenom-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <memoryBacking supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <enum name='sourceType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>file</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>anonymous</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>memfd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </memoryBacking>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <disk supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='diskDevice'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>disk</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>cdrom</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>floppy</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>lun</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='bus'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>fdc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>scsi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>sata</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-non-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <graphics supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vnc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>egl-headless</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dbus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </graphics>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <video supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='modelType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vga</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>cirrus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>none</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>bochs</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ramfb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <hostdev supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='mode'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>subsystem</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='startupPolicy'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>default</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>mandatory</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>requisite</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>optional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='subsysType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pci</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>scsi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='capsType'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='pciBackend'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </hostdev>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <rng supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-non-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>random</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>egd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>builtin</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <filesystem supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='driverType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>path</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>handle</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtiofs</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </filesystem>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <tpm supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tpm-tis</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tpm-crb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>emulator</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>external</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendVersion'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>2.0</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </tpm>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <redirdev supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='bus'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </redirdev>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <channel supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pty</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>unix</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </channel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <crypto supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>qemu</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>builtin</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </crypto>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <interface supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>default</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>passt</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <panic supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>isa</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>hyperv</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </panic>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <console supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>null</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pty</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dev</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>file</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pipe</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>stdio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>udp</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tcp</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>unix</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>qemu-vdagent</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dbus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </console>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <gic supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <vmcoreinfo supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <genid supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <backingStoreInput supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <backup supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <async-teardown supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <ps2 supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <sev supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <sgx supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <hyperv supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='features'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>relaxed</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vapic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>spinlocks</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vpindex</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>runtime</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>synic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>stimer</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>reset</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vendor_id</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>frequencies</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>reenlightenment</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tlbflush</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ipi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>avic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>emsr_bitmap</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>xmm_input</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <defaults>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <spinlocks>4095</spinlocks>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <stimer_direct>on</stimer_direct>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </defaults>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </hyperv>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <launchSecurity supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='sectype'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tdx</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </launchSecurity>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: </domainCapabilities>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.462 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 19:48:17 np0005539279 nova_compute[187514]: <domainCapabilities>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <domain>kvm</domain>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <arch>i686</arch>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <vcpu max='240'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <iothreads supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <os supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <enum name='firmware'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <loader supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>rom</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pflash</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='readonly'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>yes</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>no</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='secure'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>no</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </loader>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='host-passthrough' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='hostPassthroughMigratable'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>on</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>off</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='maximum' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='maximumMigratable'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>on</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>off</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='host-model' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <vendor>AMD</vendor>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='x2apic'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='hypervisor'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='stibp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='overflow-recov'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='succor'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='lbrv'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc-scale'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='flushbyasid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='pause-filter'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='pfthreshold'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='disable' name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='custom' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Dhyana-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Genoa'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='auto-ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='auto-ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-128'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-256'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-512'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v6'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v7'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='KnightsMill'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512er'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512pf'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='KnightsMill-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512er'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512pf'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G4-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tbm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G5-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tbm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SierraForest'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cmpccxadd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SierraForest-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cmpccxadd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='athlon'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='athlon-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='core2duo'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='core2duo-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='coreduo'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='coreduo-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='n270'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='n270-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='phenom'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='phenom-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <memoryBacking supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <enum name='sourceType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>file</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>anonymous</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>memfd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </memoryBacking>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <disk supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='diskDevice'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>disk</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>cdrom</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>floppy</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>lun</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='bus'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ide</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>fdc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>scsi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>sata</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-non-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <graphics supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vnc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>egl-headless</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dbus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </graphics>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <video supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='modelType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vga</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>cirrus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>none</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>bochs</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ramfb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <hostdev supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='mode'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>subsystem</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='startupPolicy'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>default</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>mandatory</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>requisite</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>optional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='subsysType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pci</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>scsi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='capsType'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='pciBackend'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </hostdev>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <rng supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-non-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>random</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>egd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>builtin</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <filesystem supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='driverType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>path</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>handle</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtiofs</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </filesystem>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <tpm supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tpm-tis</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tpm-crb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>emulator</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>external</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendVersion'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>2.0</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </tpm>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <redirdev supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='bus'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </redirdev>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <channel supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pty</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>unix</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </channel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <crypto supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>qemu</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>builtin</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </crypto>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <interface supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>default</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>passt</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <panic supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>isa</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>hyperv</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </panic>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <console supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>null</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pty</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dev</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>file</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pipe</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>stdio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>udp</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tcp</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>unix</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>qemu-vdagent</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dbus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </console>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <gic supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <vmcoreinfo supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <genid supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <backingStoreInput supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <backup supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <async-teardown supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <ps2 supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <sev supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <sgx supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <hyperv supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='features'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>relaxed</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vapic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>spinlocks</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vpindex</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>runtime</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>synic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>stimer</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>reset</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vendor_id</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>frequencies</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>reenlightenment</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tlbflush</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ipi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>avic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>emsr_bitmap</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>xmm_input</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <defaults>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <spinlocks>4095</spinlocks>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <stimer_direct>on</stimer_direct>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </defaults>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </hyperv>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <launchSecurity supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='sectype'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tdx</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </launchSecurity>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: </domainCapabilities>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.499 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.505 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 19:48:17 np0005539279 nova_compute[187514]: <domainCapabilities>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <domain>kvm</domain>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <arch>x86_64</arch>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <vcpu max='4096'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <iothreads supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <os supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <enum name='firmware'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>efi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <loader supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>rom</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pflash</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='readonly'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>yes</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>no</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='secure'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>yes</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>no</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </loader>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='host-passthrough' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='hostPassthroughMigratable'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>on</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>off</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='maximum' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='maximumMigratable'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>on</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>off</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='host-model' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <vendor>AMD</vendor>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='x2apic'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='hypervisor'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='stibp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='overflow-recov'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='succor'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='lbrv'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc-scale'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='flushbyasid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='pause-filter'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='pfthreshold'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='disable' name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='custom' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Dhyana-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Genoa'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='auto-ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='auto-ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-128'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-256'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-512'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v6'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v7'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='KnightsMill'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512er'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512pf'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='KnightsMill-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512er'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512pf'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G4-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tbm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G5-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tbm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SierraForest'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cmpccxadd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SierraForest-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cmpccxadd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='athlon'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='athlon-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='core2duo'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='core2duo-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='coreduo'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='coreduo-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='n270'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='n270-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='phenom'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='phenom-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <memoryBacking supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <enum name='sourceType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>file</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>anonymous</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>memfd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </memoryBacking>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <disk supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='diskDevice'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>disk</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>cdrom</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>floppy</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>lun</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='bus'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>fdc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>scsi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>sata</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-non-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <graphics supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vnc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>egl-headless</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dbus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </graphics>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <video supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='modelType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vga</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>cirrus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>none</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>bochs</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ramfb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <hostdev supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='mode'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>subsystem</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='startupPolicy'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>default</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>mandatory</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>requisite</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>optional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='subsysType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pci</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>scsi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='capsType'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='pciBackend'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </hostdev>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <rng supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-non-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>random</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>egd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>builtin</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <filesystem supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='driverType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>path</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>handle</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtiofs</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </filesystem>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <tpm supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tpm-tis</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tpm-crb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>emulator</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>external</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendVersion'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>2.0</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </tpm>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <redirdev supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='bus'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </redirdev>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <channel supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pty</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>unix</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </channel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <crypto supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>qemu</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>builtin</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </crypto>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <interface supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>default</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>passt</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <panic supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>isa</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>hyperv</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </panic>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <console supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>null</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pty</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dev</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>file</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pipe</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>stdio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>udp</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tcp</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>unix</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>qemu-vdagent</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dbus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </console>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <gic supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <vmcoreinfo supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <genid supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <backingStoreInput supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <backup supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <async-teardown supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <ps2 supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <sev supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <sgx supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <hyperv supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='features'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>relaxed</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vapic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>spinlocks</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vpindex</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>runtime</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>synic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>stimer</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>reset</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vendor_id</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>frequencies</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>reenlightenment</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tlbflush</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ipi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>avic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>emsr_bitmap</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>xmm_input</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <defaults>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <spinlocks>4095</spinlocks>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <stimer_direct>on</stimer_direct>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </defaults>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </hyperv>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <launchSecurity supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='sectype'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tdx</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </launchSecurity>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: </domainCapabilities>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.569 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 19:48:17 np0005539279 nova_compute[187514]: <domainCapabilities>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <domain>kvm</domain>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <arch>x86_64</arch>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <vcpu max='240'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <iothreads supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <os supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <enum name='firmware'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <loader supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>rom</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pflash</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='readonly'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>yes</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>no</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='secure'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>no</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </loader>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='host-passthrough' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='hostPassthroughMigratable'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>on</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>off</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='maximum' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='maximumMigratable'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>on</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>off</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='host-model' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <vendor>AMD</vendor>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='x2apic'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='hypervisor'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='stibp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='overflow-recov'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='succor'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='lbrv'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='tsc-scale'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='flushbyasid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='pause-filter'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='pfthreshold'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <feature policy='disable' name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <mode name='custom' supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Broadwell-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Cooperlake-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Denverton-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Dhyana-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Genoa'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='auto-ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='auto-ibrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Milan-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amd-psfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='no-nested-data-bp'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='null-sel-clr-base'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='stibp-always-on'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-Rome-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='EPYC-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='GraniteRapids-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-128'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-256'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx10-512'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='prefetchiti'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Haswell-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v6'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Icelake-Server-v7'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='IvyBridge-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='KnightsMill'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512er'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512pf'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='KnightsMill-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4fmaps'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-4vnniw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512er'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512pf'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G4-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tbm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Opteron_G5-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fma4'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tbm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xop'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SapphireRapids-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='amx-tile'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-bf16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-fp16'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512-vpopcntdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bitalg'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vbmi2'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrc'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fzrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='la57'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='taa-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='tsx-ldtrk'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xfd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SierraForest'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cmpccxadd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='SierraForest-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ifma'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-ne-convert'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx-vnni-int8'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='bus-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cmpccxadd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fbsdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='fsrs'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ibrs-all'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mcdt-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pbrsb-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='psdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='sbdr-ssdp-no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='serialize'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vaes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='vpclmulqdq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Client-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='hle'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='rtm'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Skylake-Server-v5'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512bw'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512cd'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512dq'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512f'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='avx512vl'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='invpcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pcid'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='pku'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='mpx'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v2'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v3'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='core-capability'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='split-lock-detect'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='Snowridge-v4'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='cldemote'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='erms'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='gfni'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdir64b'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='movdiri'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='xsaves'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='athlon'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='athlon-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='core2duo'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='core2duo-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='coreduo'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='coreduo-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='n270'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='n270-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='ss'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='phenom'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <blockers model='phenom-v1'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnow'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <feature name='3dnowext'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </blockers>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </mode>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <memoryBacking supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <enum name='sourceType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>file</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>anonymous</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <value>memfd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </memoryBacking>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <disk supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='diskDevice'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>disk</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>cdrom</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>floppy</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>lun</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='bus'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ide</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>fdc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>scsi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>sata</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-non-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <graphics supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vnc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>egl-headless</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dbus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </graphics>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <video supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='modelType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vga</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>cirrus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>none</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>bochs</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ramfb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <hostdev supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='mode'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>subsystem</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='startupPolicy'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>default</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>mandatory</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>requisite</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>optional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='subsysType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pci</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>scsi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='capsType'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='pciBackend'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </hostdev>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <rng supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtio-non-transitional</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>random</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>egd</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>builtin</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <filesystem supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='driverType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>path</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>handle</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>virtiofs</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </filesystem>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <tpm supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tpm-tis</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tpm-crb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>emulator</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>external</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendVersion'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>2.0</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </tpm>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <redirdev supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='bus'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>usb</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </redirdev>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <channel supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pty</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>unix</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </channel>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <crypto supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>qemu</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendModel'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>builtin</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </crypto>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <interface supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='backendType'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>default</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>passt</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <panic supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='model'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>isa</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>hyperv</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </panic>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <console supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='type'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>null</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vc</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pty</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dev</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>file</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>pipe</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>stdio</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>udp</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tcp</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>unix</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>qemu-vdagent</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>dbus</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </console>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <gic supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <vmcoreinfo supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <genid supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <backingStoreInput supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <backup supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <async-teardown supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <ps2 supported='yes'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <sev supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <sgx supported='no'/>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <hyperv supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='features'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>relaxed</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vapic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>spinlocks</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vpindex</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>runtime</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>synic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>stimer</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>reset</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>vendor_id</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>frequencies</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>reenlightenment</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tlbflush</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>ipi</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>avic</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>emsr_bitmap</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>xmm_input</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <defaults>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <spinlocks>4095</spinlocks>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <stimer_direct>on</stimer_direct>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </defaults>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </hyperv>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    <launchSecurity supported='yes'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      <enum name='sectype'>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:        <value>tdx</value>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:      </enum>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:    </launchSecurity>
Nov 28 19:48:17 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:48:17 np0005539279 nova_compute[187514]: </domainCapabilities>
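The domainCapabilities document logged above (fetched by nova's `_get_domain_capabilities`) is plain XML and can be inspected with the standard library. A minimal sketch, assuming a hypothetical helper `tpm_emulator_supported`; the XML below is abridged from the `<tpm>` block in the log:

```python
import xml.etree.ElementTree as ET

# Abridged from the <devices><tpm> section of the logged domainCapabilities XML.
caps_xml = """
<domainCapabilities>
  <devices>
    <tpm supported='yes'>
      <enum name='model'>
        <value>tpm-tis</value>
        <value>tpm-crb</value>
      </enum>
      <enum name='backendModel'>
        <value>emulator</value>
        <value>external</value>
      </enum>
    </tpm>
  </devices>
</domainCapabilities>
"""

def tpm_emulator_supported(xml_text: str) -> bool:
    """True if the capabilities advertise an emulated (swtpm) TPM backend."""
    root = ET.fromstring(xml_text)
    tpm = root.find("./devices/tpm")
    if tpm is None or tpm.get("supported") != "yes":
        return False
    backends = tpm.find("./enum[@name='backendModel']")
    values = [v.text for v in backends.findall("value")] if backends is not None else []
    return "emulator" in values

print(tpm_emulator_supported(caps_xml))  # True
```

This matches the "Enabling emulated TPM support" debug line a few records further down: the host advertises both `emulator` and `external` backends.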
Nov 28 19:48:17 np0005539279 nova_compute[187514]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.632 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.632 187518 INFO nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Secure Boot support detected#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.635 187518 INFO nova.virt.libvirt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.646 187518 DEBUG nova.virt.libvirt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.672 187518 INFO nova.virt.node [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Determined node identity 15673c9a-eee0-47b4-b3d3-728a0fedb147 from /var/lib/nova/compute_id#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.694 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Verified node 15673c9a-eee0-47b4-b3d3-728a0fedb147 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Nov 28 19:48:17 np0005539279 nova_compute[187514]: 2025-11-29 00:48:17.742 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.217 187518 ERROR nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Could not retrieve compute node resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '15673c9a-eee0-47b4-b3d3-728a0fedb147' not found: No resource provider with uuid 15673c9a-eee0-47b4-b3d3-728a0fedb147 found  ", "request_id": "req-e5fa334d-9fb5-426e-85c6-2cc38f767110"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '15673c9a-eee0-47b4-b3d3-728a0fedb147' not found: No resource provider with uuid 15673c9a-eee0-47b4-b3d3-728a0fedb147 found  ", "request_id": "req-e5fa334d-9fb5-426e-85c6-2cc38f767110"}]}#033[00m
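The 404 above uses the placement API's standard error envelope (`{"errors": [...]}`). A sketch of pulling out the fields nova reports, using the payload from the log (detail abridged) and a hypothetical `first_error` helper:

```python
import json

# Error envelope as returned by the placement API (abridged from the log record above).
payload = json.loads("""
{"errors": [{"status": 404, "title": "Not Found",
 "detail": "The resource could not be found.\\n\\n Resource provider '15673c9a-eee0-47b4-b3d3-728a0fedb147' not found",
 "request_id": "req-e5fa334d-9fb5-426e-85c6-2cc38f767110"}]}
""")

def first_error(payload: dict) -> tuple:
    """Return (status, title, request_id) of the first error in a placement envelope."""
    err = payload["errors"][0]
    return err["status"], err["title"], err["request_id"]

status, title, request_id = first_error(payload)
print(status, title)  # 404 Not Found
```

The error is transient here: the resource provider does not exist yet at startup, and the log shows nova creating it via placement about a second later.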
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.261 187518 DEBUG oslo_concurrency.lockutils [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.262 187518 DEBUG oslo_concurrency.lockutils [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.262 187518 DEBUG oslo_concurrency.lockutils [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.263 187518 DEBUG nova.compute.resource_tracker [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.448 187518 WARNING nova.virt.libvirt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.449 187518 DEBUG nova.compute.resource_tracker [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6165MB free_disk=73.54392623901367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.449 187518 DEBUG oslo_concurrency.lockutils [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.450 187518 DEBUG oslo_concurrency.lockutils [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.644 187518 ERROR nova.compute.resource_tracker [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '15673c9a-eee0-47b4-b3d3-728a0fedb147' not found: No resource provider with uuid 15673c9a-eee0-47b4-b3d3-728a0fedb147 found  ", "request_id": "req-c65d2cc8-444b-4c17-970b-26722e0bd4a9"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '15673c9a-eee0-47b4-b3d3-728a0fedb147' not found: No resource provider with uuid 15673c9a-eee0-47b4-b3d3-728a0fedb147 found  ", "request_id": "req-c65d2cc8-444b-4c17-970b-26722e0bd4a9"}]}#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.645 187518 DEBUG nova.compute.resource_tracker [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:48:18 np0005539279 nova_compute[187514]: 2025-11-29 00:48:18.646 187518 DEBUG nova.compute.resource_tracker [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.144 187518 INFO nova.scheduler.client.report [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [req-d612a49d-86d0-4eb6-98da-73004cf1cfd9] Created resource provider record via placement API for resource provider with UUID 15673c9a-eee0-47b4-b3d3-728a0fedb147 and name compute-0.ctlplane.example.com.#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.166 187518 DEBUG nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 19:48:19 np0005539279 nova_compute[187514]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.167 187518 INFO nova.virt.libvirt.host [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] kernel doesn't support AMD SEV#033[00m
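The SEV check above boils down to reading a kvm_amd module parameter, which on this host contains `N`. A sketch of the same check as a standalone function (hypothetical helper, loosely modeled on nova's `_kernel_supports_amd_sev`; exact accepted values vary by kernel, so `1`/`Y` here is an assumption):

```python
from pathlib import Path

# Path from the log record above.
SEV_PARAM = Path("/sys/module/kvm_amd/parameters/sev")

def kernel_supports_amd_sev(param_path: Path = SEV_PARAM) -> bool:
    """SEV is supported when the kvm_amd 'sev' parameter reads '1' or 'Y';
    'N' (as logged for this host) or a missing file means no support."""
    try:
        value = param_path.read_text().strip()
    except OSError:
        return False
    return value in ("1", "Y")
```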
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.167 187518 DEBUG nova.compute.provider_tree [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.168 187518 DEBUG nova.virt.libvirt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.225 187518 DEBUG nova.scheduler.client.report [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Updated inventory for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.226 187518 DEBUG nova.compute.provider_tree [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Updating resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.226 187518 DEBUG nova.compute.provider_tree [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
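The inventory dicts logged above determine how much capacity placement will actually hand out: for each resource class the usable amount is `(total - reserved) * allocation_ratio`. A sketch with the logged figures and a hypothetical `effective_capacity` helper (rounding added for readability):

```python
# Inventory fields as logged for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147.
inventory = {
    "MEMORY_MB": {"total": 7680, "reserved": 512, "allocation_ratio": 1.0},
    "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
    "DISK_GB": {"total": 79, "reserved": 0, "allocation_ratio": 0.9},
}

def effective_capacity(inv: dict) -> dict:
    """Capacity placement allocates against: (total - reserved) * allocation_ratio."""
    return {
        rc: round((f["total"] - f["reserved"]) * f["allocation_ratio"], 2)
        for rc, f in inv.items()
    }

print(effective_capacity(inventory))
# {'MEMORY_MB': 7168.0, 'VCPU': 32.0, 'DISK_GB': 71.1}
```

So this 8-vCPU host can be oversubscribed to 32 VCPUs, while the 0.9 disk ratio undersubscribes disk to guard against image growth.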
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.332 187518 DEBUG nova.compute.provider_tree [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Updating resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.392 187518 DEBUG nova.compute.resource_tracker [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.393 187518 DEBUG oslo_concurrency.lockutils [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.393 187518 DEBUG nova.service [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.491 187518 DEBUG nova.service [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 28 19:48:19 np0005539279 nova_compute[187514]: 2025-11-29 00:48:19.491 187518 DEBUG nova.servicegroup.drivers.db [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 28 19:48:22 np0005539279 systemd-logind[811]: New session 26 of user zuul.
Nov 28 19:48:22 np0005539279 systemd[1]: Started Session 26 of User zuul.
Nov 28 19:48:23 np0005539279 python3.9[187985]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 19:48:25 np0005539279 python3.9[188141]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:48:25 np0005539279 systemd[1]: Reloading.
Nov 28 19:48:25 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:48:25 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:48:26 np0005539279 python3.9[188327]: ansible-ansible.builtin.service_facts Invoked
Nov 28 19:48:26 np0005539279 network[188344]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 19:48:26 np0005539279 network[188345]: 'network-scripts' will be removed from distribution in near future.
Nov 28 19:48:26 np0005539279 network[188346]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 19:48:29 np0005539279 podman[188447]: 2025-11-29 00:48:29.990406862 +0000 UTC m=+0.090221110 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 28 19:48:31 np0005539279 python3.9[188641]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:48:32 np0005539279 python3.9[188794]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:32 np0005539279 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 19:48:33 np0005539279 python3.9[188947]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:34 np0005539279 python3.9[189101]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:48:35 np0005539279 python3.9[189253]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 19:48:36 np0005539279 python3.9[189405]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:48:36 np0005539279 systemd[1]: Reloading.
Nov 28 19:48:36 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:48:36 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:48:38 np0005539279 python3.9[189592]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:48:38 np0005539279 python3.9[189745]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:48:39 np0005539279 python3.9[189897]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:48:40 np0005539279 python3.9[190049]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:48:41 np0005539279 python3.9[190170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377320.0655565-133-64068967521388/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:48:42 np0005539279 python3.9[190324]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 28 19:48:43 np0005539279 podman[190450]: 2025-11-29 00:48:43.638120699 +0000 UTC m=+0.152114209 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:48:43 np0005539279 python3.9[190494]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 28 19:48:44 np0005539279 python3.9[190656]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 19:48:44 np0005539279 auditd[702]: Audit daemon rotating log files
Nov 28 19:48:45 np0005539279 podman[190786]: 2025-11-29 00:48:45.740156601 +0000 UTC m=+0.095876350 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 19:48:45 np0005539279 python3.9[190834]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 19:48:47 np0005539279 python3.9[190992]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:48:48 np0005539279 python3.9[191113]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764377326.9205964-201-2673575362458/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:48 np0005539279 python3.9[191263]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:48:49 np0005539279 python3.9[191384]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764377328.397536-201-194963606629329/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:50 np0005539279 python3.9[191534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:48:51 np0005539279 python3.9[191657]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764377329.7577364-201-272586392482322/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:51 np0005539279 python3.9[191807]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:48:52 np0005539279 python3.9[191959]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:48:53 np0005539279 python3.9[192111]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:48:54 np0005539279 python3.9[192232]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377332.7865043-260-276490234809190/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:54 np0005539279 python3.9[192382]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:48:55 np0005539279 python3.9[192458]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:56 np0005539279 python3.9[192608]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:48:56 np0005539279 python3.9[192729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377335.5118952-260-216636785521336/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:57 np0005539279 python3.9[192879]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:48:58 np0005539279 python3.9[193002]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377337.111421-260-2712948152955/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:48:59 np0005539279 python3.9[193152]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:48:59 np0005539279 python3.9[193273]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377338.4881387-260-214777189389268/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:00 np0005539279 podman[193397]: 2025-11-29 00:49:00.300802943 +0000 UTC m=+0.103839523 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 28 19:49:00 np0005539279 python3.9[193433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:01 np0005539279 python3.9[193564]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377339.8392339-260-276949288177162/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:01 np0005539279 nova_compute[187514]: 2025-11-29 00:49:01.492 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:01 np0005539279 nova_compute[187514]: 2025-11-29 00:49:01.523 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:01 np0005539279 python3.9[193714]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:02 np0005539279 python3.9[193835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377341.3259482-260-223672097675887/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:03 np0005539279 python3.9[193985]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:03 np0005539279 python3.9[194106]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377342.584666-260-57516828074657/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:04 np0005539279 python3.9[194256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:04 np0005539279 python3.9[194377]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377343.8683596-260-147623709012209/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:05 np0005539279 python3.9[194527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:06 np0005539279 python3.9[194650]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377345.1659343-260-187591120667578/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:07 np0005539279 python3.9[194800]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:07 np0005539279 python3.9[194921]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377346.62704-260-79859737446671/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:49:08.079 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:49:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:49:08.079 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:49:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:49:08.079 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:49:08 np0005539279 python3.9[195071]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:09 np0005539279 python3.9[195149]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:10 np0005539279 python3.9[195299]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:10 np0005539279 python3.9[195375]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:11 np0005539279 python3.9[195525]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:12 np0005539279 python3.9[195601]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:13 np0005539279 python3.9[195753]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:13 np0005539279 podman[195877]: 2025-11-29 00:49:13.82414464 +0000 UTC m=+0.137069164 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Nov 28 19:49:14 np0005539279 python3.9[195923]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:14 np0005539279 python3.9[196081]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:49:15 np0005539279 python3.9[196233]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:49:15 np0005539279 podman[196236]: 2025-11-29 00:49:15.86034784 +0000 UTC m=+0.070784444 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.611 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.612 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.612 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.631 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.631 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.632 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.632 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.633 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.633 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.634 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.634 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.635 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.668 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.669 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.669 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.669 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:49:16 np0005539279 systemd[1]: Reloading.
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.919 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.921 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6157MB free_disk=73.54384231567383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.922 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.922 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:49:16 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:49:16 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.995 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:49:16 np0005539279 nova_compute[187514]: 2025-11-29 00:49:16.996 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:49:17 np0005539279 nova_compute[187514]: 2025-11-29 00:49:17.021 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:49:17 np0005539279 nova_compute[187514]: 2025-11-29 00:49:17.039 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:49:17 np0005539279 nova_compute[187514]: 2025-11-29 00:49:17.041 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:49:17 np0005539279 nova_compute[187514]: 2025-11-29 00:49:17.041 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:49:17 np0005539279 systemd[1]: Listening on Podman API Socket.
Nov 28 19:49:18 np0005539279 python3.9[196442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:18 np0005539279 python3.9[196565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377357.580496-482-85459419486021/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:49:19 np0005539279 python3.9[196641]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:19 np0005539279 python3.9[196764]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377357.580496-482-85459419486021/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:49:21 np0005539279 python3.9[196916]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 28 19:49:22 np0005539279 python3.9[197068]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 19:49:23 np0005539279 python3[197220]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 19:49:23 np0005539279 podman[197255]: 2025-11-29 00:49:23.804922763 +0000 UTC m=+0.079805401 container create 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 19:49:23 np0005539279 podman[197255]: 2025-11-29 00:49:23.7668429 +0000 UTC m=+0.041725618 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 19:49:23 np0005539279 python3[197220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 28 19:49:24 np0005539279 python3.9[197445]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:49:25 np0005539279 python3.9[197599]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:26 np0005539279 python3.9[197750]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764377365.8696642-546-255998685670489/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:27 np0005539279 python3.9[197828]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:49:27 np0005539279 systemd[1]: Reloading.
Nov 28 19:49:27 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:49:27 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:49:28 np0005539279 python3.9[197939]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:49:28 np0005539279 systemd[1]: Reloading.
Nov 28 19:49:28 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:49:28 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:49:29 np0005539279 systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 19:49:29 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:49:29 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1676e4d4d9bb158cb3d854e9e20c4848c0a74e68eace24bab9f984f322b1a9cb/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:29 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1676e4d4d9bb158cb3d854e9e20c4848c0a74e68eace24bab9f984f322b1a9cb/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:29 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1676e4d4d9bb158cb3d854e9e20c4848c0a74e68eace24bab9f984f322b1a9cb/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:29 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1676e4d4d9bb158cb3d854e9e20c4848c0a74e68eace24bab9f984f322b1a9cb/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:29 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0.
Nov 28 19:49:29 np0005539279 podman[197979]: 2025-11-29 00:49:29.290835003 +0000 UTC m=+0.129552113 container init 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: + sudo -E kolla_set_configs
Nov 28 19:49:29 np0005539279 podman[197979]: 2025-11-29 00:49:29.330058337 +0000 UTC m=+0.168775397 container start 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: sudo: unable to send audit message: Operation not permitted
Nov 28 19:49:29 np0005539279 podman[197979]: ceilometer_agent_compute
Nov 28 19:49:29 np0005539279 systemd[1]: Started ceilometer_agent_compute container.
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Validating config file
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Copying service configuration files
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: INFO:__main__:Writing out command to execute
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: ++ cat /run_command
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: + ARGS=
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: + sudo kolla_copy_cacerts
Nov 28 19:49:29 np0005539279 podman[198002]: 2025-11-29 00:49:29.424058916 +0000 UTC m=+0.074559293 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: sudo: unable to send audit message: Operation not permitted
Nov 28 19:49:29 np0005539279 systemd[1]: 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0-38414c3d9cc1f4ec.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 19:49:29 np0005539279 systemd[1]: 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0-38414c3d9cc1f4ec.service: Failed with result 'exit-code'.
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: + [[ ! -n '' ]]
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: + . kolla_extend_start
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: + umask 0022
Nov 28 19:49:29 np0005539279 ceilometer_agent_compute[197995]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.234 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.234 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.234 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.235 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.235 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.235 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.235 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.235 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.235 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.235 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.235 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.235 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.236 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.237 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.238 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.239 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.240 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.241 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.242 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.243 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.244 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.245 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.246 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.247 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.248 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.249 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.249 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.267 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.268 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.268 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.341 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 19:49:30 np0005539279 python3.9[198179]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:49:30 np0005539279 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.433 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.433 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.433 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.433 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.433 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.433 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.434 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.434 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.434 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.434 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.434 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.434 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.434 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.435 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.435 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.435 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.435 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.435 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.435 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.435 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.435 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.436 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.436 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.436 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.436 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.436 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.436 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.436 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.436 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.436 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.437 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.437 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.437 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.437 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.437 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.437 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.437 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.437 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.437 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.438 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.438 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.438 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.438 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.438 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.438 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.438 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.438 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.439 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.439 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.439 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.439 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.439 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.439 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.439 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.439 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.440 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.441 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.441 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.441 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.441 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.441 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.441 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.441 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.441 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.442 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.442 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.442 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.442 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.442 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.442 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 podman[198183]: 2025-11-29 00:49:30.44428577 +0000 UTC m=+0.076280409 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.442 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.442 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.443 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.443 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.443 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.443 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.443 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.443 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.443 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.443 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.444 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.444 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.444 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.444 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.444 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.444 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.444 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.444 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.445 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.446 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.446 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.446 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.446 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.446 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.446 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.446 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.446 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.447 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.447 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.447 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.447 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.447 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.447 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.447 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.447 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.448 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.448 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.448 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.448 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.448 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.448 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.448 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.449 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.449 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.449 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.449 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.449 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.449 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.449 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.449 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.450 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.450 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.450 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.450 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.450 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.450 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.450 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.450 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.450 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.451 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.451 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.451 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.451 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.452 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.453 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.454 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.455 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.456 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.457 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.458 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.458 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.458 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.461 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.464 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.469 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.474 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.474 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.474 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.475 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.475 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.475 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.475 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.475 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.475 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.475 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.476 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.476 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.476 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.476 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.476 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.476 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.476 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.477 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.477 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.477 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.477 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.477 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.477 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.478 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.478 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.565 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.565 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.565 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 28 19:49:30 np0005539279 ceilometer_agent_compute[197995]: 2025-11-29 00:49:30.573 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 28 19:49:30 np0005539279 virtqemud[187089]: End of file while reading data: Input/output error
Nov 28 19:49:30 np0005539279 virtqemud[187089]: End of file while reading data: Input/output error
Nov 28 19:49:30 np0005539279 systemd[1]: libpod-120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0.scope: Deactivated successfully.
Nov 28 19:49:30 np0005539279 systemd[1]: libpod-120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0.scope: Consumed 1.320s CPU time.
Nov 28 19:49:30 np0005539279 podman[198201]: 2025-11-29 00:49:30.722847434 +0000 UTC m=+0.298489255 container died 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 19:49:30 np0005539279 systemd[1]: 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0-38414c3d9cc1f4ec.timer: Deactivated successfully.
Nov 28 19:49:30 np0005539279 systemd[1]: Stopped /usr/bin/podman healthcheck run 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0.
Nov 28 19:49:30 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0-userdata-shm.mount: Deactivated successfully.
Nov 28 19:49:30 np0005539279 systemd[1]: var-lib-containers-storage-overlay-1676e4d4d9bb158cb3d854e9e20c4848c0a74e68eace24bab9f984f322b1a9cb-merged.mount: Deactivated successfully.
Nov 28 19:49:30 np0005539279 podman[198201]: 2025-11-29 00:49:30.784365792 +0000 UTC m=+0.360007593 container cleanup 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 19:49:30 np0005539279 podman[198201]: ceilometer_agent_compute
Nov 28 19:49:30 np0005539279 podman[198239]: ceilometer_agent_compute
Nov 28 19:49:30 np0005539279 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 28 19:49:30 np0005539279 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 28 19:49:30 np0005539279 systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 19:49:31 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:49:31 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1676e4d4d9bb158cb3d854e9e20c4848c0a74e68eace24bab9f984f322b1a9cb/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:31 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1676e4d4d9bb158cb3d854e9e20c4848c0a74e68eace24bab9f984f322b1a9cb/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:31 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1676e4d4d9bb158cb3d854e9e20c4848c0a74e68eace24bab9f984f322b1a9cb/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:31 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1676e4d4d9bb158cb3d854e9e20c4848c0a74e68eace24bab9f984f322b1a9cb/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:31 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0.
Nov 28 19:49:31 np0005539279 podman[198251]: 2025-11-29 00:49:31.09334187 +0000 UTC m=+0.170237997 container init 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: + sudo -E kolla_set_configs
Nov 28 19:49:31 np0005539279 podman[198251]: 2025-11-29 00:49:31.131089084 +0000 UTC m=+0.207985221 container start 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: sudo: unable to send audit message: Operation not permitted
Nov 28 19:49:31 np0005539279 podman[198251]: ceilometer_agent_compute
Nov 28 19:49:31 np0005539279 systemd[1]: Started ceilometer_agent_compute container.
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Validating config file
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Copying service configuration files
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: INFO:__main__:Writing out command to execute
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: ++ cat /run_command
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: + ARGS=
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: + sudo kolla_copy_cacerts
Nov 28 19:49:31 np0005539279 podman[198273]: 2025-11-29 00:49:31.239232846 +0000 UTC m=+0.087635977 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: sudo: unable to send audit message: Operation not permitted
Nov 28 19:49:31 np0005539279 systemd[1]: 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0-2a0f5b19067c9531.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 19:49:31 np0005539279 systemd[1]: 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0-2a0f5b19067c9531.service: Failed with result 'exit-code'.
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: + [[ ! -n '' ]]
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: + . kolla_extend_start
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: + umask 0022
Nov 28 19:49:31 np0005539279 ceilometer_agent_compute[198266]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 28 19:49:31 np0005539279 python3.9[198450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.096 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.096 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.096 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.096 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.097 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.097 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.097 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.097 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.097 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.097 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.098 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.098 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.098 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.098 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.098 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.099 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.099 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.099 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.099 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.099 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.099 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.100 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.100 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.100 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.100 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.100 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.100 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.100 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.101 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.101 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.101 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.101 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.101 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.101 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.101 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.102 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.102 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.102 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.102 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.102 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.102 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.103 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.103 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.103 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.103 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.103 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.103 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.103 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.104 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.104 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.104 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.104 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.104 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.104 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.105 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.105 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.105 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.105 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.105 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.105 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.106 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.106 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.106 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.106 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.106 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.106 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.106 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.107 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.107 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.107 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.107 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.107 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.107 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.108 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.108 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.108 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.108 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.108 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.109 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.109 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.109 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.109 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.109 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.109 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.109 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.110 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.110 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.110 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.110 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.110 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.110 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.111 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.111 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.111 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.111 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.111 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.112 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.112 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.112 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.112 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.112 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.112 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.113 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.113 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.113 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.113 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.113 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.113 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.114 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.114 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.114 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.114 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.114 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.114 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.114 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.115 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.115 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.115 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.115 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.115 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.115 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.116 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.116 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.116 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.116 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.116 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.116 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.117 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.117 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.117 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.117 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.117 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.117 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.117 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.118 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.118 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.118 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.118 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.118 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.118 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.119 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.119 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.119 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.119 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.119 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.119 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.119 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.120 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.120 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.120 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.120 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.120 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.120 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.120 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.121 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.121 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.121 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.121 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.121 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.121 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.122 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.122 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.122 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.122 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.122 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.141 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.142 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.143 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.154 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.274 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.274 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.274 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.274 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.274 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.274 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.274 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.274 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.275 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.275 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.275 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.275 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.275 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.275 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.275 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.275 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.275 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.276 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.277 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.278 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.279 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.280 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.280 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.280 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.280 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.280 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.280 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.280 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.280 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.280 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.281 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.282 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.283 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.284 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.285 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.286 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.287 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.288 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.289 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.290 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.291 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.292 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.293 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.296 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.303 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:49:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:49:32 np0005539279 python3.9[198579]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377371.4179256-578-277010692346671/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:49:33 np0005539279 python3.9[198731]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 28 19:49:34 np0005539279 python3.9[198883]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 19:49:35 np0005539279 python3[199035]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 19:49:35 np0005539279 podman[199072]: 2025-11-29 00:49:35.957876055 +0000 UTC m=+0.063452381 container create b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible)
Nov 28 19:49:35 np0005539279 podman[199072]: 2025-11-29 00:49:35.922990239 +0000 UTC m=+0.028566655 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 28 19:49:35 np0005539279 python3[199035]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 28 19:49:36 np0005539279 python3.9[199262]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:49:38 np0005539279 python3.9[199416]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:39 np0005539279 python3.9[199569]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764377378.9006813-631-257422189182348/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:40 np0005539279 python3.9[199645]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:49:40 np0005539279 systemd[1]: Reloading.
Nov 28 19:49:40 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:49:40 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:49:41 np0005539279 python3.9[199756]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:49:41 np0005539279 systemd[1]: Reloading.
Nov 28 19:49:41 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:49:41 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:49:41 np0005539279 systemd[1]: Starting node_exporter container...
Nov 28 19:49:41 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:49:41 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebbdcfb93a035d2e169319b81b33a17a2e7fc49074163d7215d874ae5a0dee9b/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:41 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebbdcfb93a035d2e169319b81b33a17a2e7fc49074163d7215d874ae5a0dee9b/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:41 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b.
Nov 28 19:49:41 np0005539279 podman[199796]: 2025-11-29 00:49:41.96898239 +0000 UTC m=+0.164954854 container init b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.986Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.987Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.987Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.988Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.988Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.988Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.988Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.989Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.989Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.989Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.989Z caller=node_exporter.go:117 level=info collector=arp
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.989Z caller=node_exporter.go:117 level=info collector=bcache
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.989Z caller=node_exporter.go:117 level=info collector=bonding
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=cpu
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=edac
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=filefd
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=netclass
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=netdev
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=netstat
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=nfs
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=nvme
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=softnet
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=systemd
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=xfs
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.990Z caller=node_exporter.go:117 level=info collector=zfs
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.991Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 28 19:49:41 np0005539279 node_exporter[199811]: ts=2025-11-29T00:49:41.992Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 28 19:49:41 np0005539279 podman[199796]: 2025-11-29 00:49:41.995457888 +0000 UTC m=+0.191430362 container start b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:49:41 np0005539279 podman[199796]: node_exporter
Nov 28 19:49:42 np0005539279 systemd[1]: Started node_exporter container.
Nov 28 19:49:42 np0005539279 podman[199820]: 2025-11-29 00:49:42.074706727 +0000 UTC m=+0.070034650 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:49:43 np0005539279 python3.9[199995]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:49:43 np0005539279 systemd[1]: Stopping node_exporter container...
Nov 28 19:49:43 np0005539279 systemd[1]: libpod-b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b.scope: Deactivated successfully.
Nov 28 19:49:43 np0005539279 podman[199999]: 2025-11-29 00:49:43.151511946 +0000 UTC m=+0.069608369 container died b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 19:49:43 np0005539279 systemd[1]: b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b-6c8be3ea488ed586.timer: Deactivated successfully.
Nov 28 19:49:43 np0005539279 systemd[1]: Stopped /usr/bin/podman healthcheck run b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b.
Nov 28 19:49:43 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b-userdata-shm.mount: Deactivated successfully.
Nov 28 19:49:43 np0005539279 systemd[1]: var-lib-containers-storage-overlay-ebbdcfb93a035d2e169319b81b33a17a2e7fc49074163d7215d874ae5a0dee9b-merged.mount: Deactivated successfully.
Nov 28 19:49:43 np0005539279 podman[199999]: 2025-11-29 00:49:43.209626261 +0000 UTC m=+0.127722694 container cleanup b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:49:43 np0005539279 podman[199999]: node_exporter
Nov 28 19:49:43 np0005539279 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 19:49:43 np0005539279 podman[200028]: node_exporter
Nov 28 19:49:43 np0005539279 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 28 19:49:43 np0005539279 systemd[1]: Stopped node_exporter container.
Nov 28 19:49:43 np0005539279 systemd[1]: Starting node_exporter container...
Nov 28 19:49:43 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:49:43 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebbdcfb93a035d2e169319b81b33a17a2e7fc49074163d7215d874ae5a0dee9b/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:43 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebbdcfb93a035d2e169319b81b33a17a2e7fc49074163d7215d874ae5a0dee9b/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:43 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b.
Nov 28 19:49:43 np0005539279 podman[200041]: 2025-11-29 00:49:43.46772717 +0000 UTC m=+0.155490797 container init b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.491Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.491Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.491Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.491Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.491Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.491Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.491Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.492Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.492Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=arp
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=bcache
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=bonding
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=cpu
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=edac
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=filefd
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=netclass
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=netdev
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=netstat
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=nfs
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=nvme
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=softnet
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=systemd
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=xfs
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.493Z caller=node_exporter.go:117 level=info collector=zfs
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.494Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 28 19:49:43 np0005539279 node_exporter[200056]: ts=2025-11-29T00:49:43.495Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 28 19:49:43 np0005539279 podman[200041]: 2025-11-29 00:49:43.512552765 +0000 UTC m=+0.200316392 container start b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 19:49:43 np0005539279 podman[200041]: node_exporter
Nov 28 19:49:43 np0005539279 systemd[1]: Started node_exporter container.
Nov 28 19:49:43 np0005539279 podman[200065]: 2025-11-29 00:49:43.636862845 +0000 UTC m=+0.107608389 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:49:44 np0005539279 podman[200212]: 2025-11-29 00:49:44.266107798 +0000 UTC m=+0.134390656 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 28 19:49:44 np0005539279 python3.9[200260]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:45 np0005539279 python3.9[200388]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377383.786216-663-6441739987919/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:49:46 np0005539279 podman[200512]: 2025-11-29 00:49:46.000649782 +0000 UTC m=+0.088396258 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 19:49:46 np0005539279 python3.9[200557]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 28 19:49:47 np0005539279 python3.9[200709]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 19:49:48 np0005539279 python3[200861]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 19:49:49 np0005539279 podman[200874]: 2025-11-29 00:49:49.8319369 +0000 UTC m=+1.549189099 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 28 19:49:50 np0005539279 podman[200971]: 2025-11-29 00:49:50.061722581 +0000 UTC m=+0.071666814 container create 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible)
Nov 28 19:49:50 np0005539279 podman[200971]: 2025-11-29 00:49:50.022523818 +0000 UTC m=+0.032468111 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 28 19:49:50 np0005539279 python3[200861]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 28 19:49:51 np0005539279 python3.9[201161]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:49:52 np0005539279 python3.9[201315]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:53 np0005539279 python3.9[201468]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764377392.194649-716-205433253838237/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:49:53 np0005539279 python3.9[201545]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:49:53 np0005539279 systemd[1]: Reloading.
Nov 28 19:49:53 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:49:53 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:49:54 np0005539279 python3.9[201657]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:49:54 np0005539279 systemd[1]: Reloading.
Nov 28 19:49:54 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:49:54 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:49:55 np0005539279 systemd[1]: Starting podman_exporter container...
Nov 28 19:49:55 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:49:55 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/166592ee0b740ceb2d3654b773f296775a50c9a26ad2d2ec44b6c2e8a5df2c26/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:55 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/166592ee0b740ceb2d3654b773f296775a50c9a26ad2d2ec44b6c2e8a5df2c26/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:55 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d.
Nov 28 19:49:55 np0005539279 podman[201698]: 2025-11-29 00:49:55.375700473 +0000 UTC m=+0.190361983 container init 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:49:55 np0005539279 podman_exporter[201714]: ts=2025-11-29T00:49:55.402Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 19:49:55 np0005539279 podman_exporter[201714]: ts=2025-11-29T00:49:55.402Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 19:49:55 np0005539279 podman_exporter[201714]: ts=2025-11-29T00:49:55.402Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 19:49:55 np0005539279 podman_exporter[201714]: ts=2025-11-29T00:49:55.402Z caller=handler.go:105 level=info collector=container
Nov 28 19:49:55 np0005539279 podman[201698]: 2025-11-29 00:49:55.414080463 +0000 UTC m=+0.228741923 container start 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 19:49:55 np0005539279 podman[201698]: podman_exporter
Nov 28 19:49:55 np0005539279 systemd[1]: Starting Podman API Service...
Nov 28 19:49:55 np0005539279 systemd[1]: Started podman_exporter container.
Nov 28 19:49:55 np0005539279 systemd[1]: Started Podman API Service.
Nov 28 19:49:55 np0005539279 podman[201725]: time="2025-11-29T00:49:55Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 28 19:49:55 np0005539279 podman[201725]: time="2025-11-29T00:49:55Z" level=info msg="Setting parallel job count to 25"
Nov 28 19:49:55 np0005539279 podman[201725]: time="2025-11-29T00:49:55Z" level=info msg="Using sqlite as database backend"
Nov 28 19:49:55 np0005539279 podman[201725]: time="2025-11-29T00:49:55Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 28 19:49:55 np0005539279 podman[201725]: time="2025-11-29T00:49:55Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 28 19:49:55 np0005539279 podman[201725]: time="2025-11-29T00:49:55Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 28 19:49:55 np0005539279 podman[201725]: @ - - [29/Nov/2025:00:49:55 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 19:49:55 np0005539279 podman[201725]: time="2025-11-29T00:49:55Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 19:49:55 np0005539279 podman[201723]: 2025-11-29 00:49:55.522632727 +0000 UTC m=+0.084355528 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 19:49:55 np0005539279 podman[201725]: @ - - [29/Nov/2025:00:49:55 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19567 "" "Go-http-client/1.1"
Nov 28 19:49:55 np0005539279 podman_exporter[201714]: ts=2025-11-29T00:49:55.529Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 28 19:49:55 np0005539279 podman_exporter[201714]: ts=2025-11-29T00:49:55.529Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 28 19:49:55 np0005539279 podman_exporter[201714]: ts=2025-11-29T00:49:55.530Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 28 19:49:55 np0005539279 systemd[1]: 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d-331a0baec20e1b67.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 19:49:55 np0005539279 systemd[1]: 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d-331a0baec20e1b67.service: Failed with result 'exit-code'.
Nov 28 19:49:56 np0005539279 python3.9[201913]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:49:56 np0005539279 systemd[1]: Stopping podman_exporter container...
Nov 28 19:49:56 np0005539279 podman[201725]: @ - - [29/Nov/2025:00:49:55 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Nov 28 19:49:56 np0005539279 systemd[1]: libpod-5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d.scope: Deactivated successfully.
Nov 28 19:49:56 np0005539279 podman[201917]: 2025-11-29 00:49:56.544371222 +0000 UTC m=+0.055294640 container died 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:49:56 np0005539279 systemd[1]: 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d-331a0baec20e1b67.timer: Deactivated successfully.
Nov 28 19:49:56 np0005539279 systemd[1]: Stopped /usr/bin/podman healthcheck run 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d.
Nov 28 19:49:56 np0005539279 systemd[1]: var-lib-containers-storage-overlay-166592ee0b740ceb2d3654b773f296775a50c9a26ad2d2ec44b6c2e8a5df2c26-merged.mount: Deactivated successfully.
Nov 28 19:49:56 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d-userdata-shm.mount: Deactivated successfully.
Nov 28 19:49:56 np0005539279 podman[201917]: 2025-11-29 00:49:56.828319712 +0000 UTC m=+0.339243160 container cleanup 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 19:49:56 np0005539279 podman[201917]: podman_exporter
Nov 28 19:49:56 np0005539279 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 19:49:56 np0005539279 podman[201944]: podman_exporter
Nov 28 19:49:56 np0005539279 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 28 19:49:56 np0005539279 systemd[1]: Stopped podman_exporter container.
Nov 28 19:49:56 np0005539279 systemd[1]: Starting podman_exporter container...
Nov 28 19:49:57 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:49:57 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/166592ee0b740ceb2d3654b773f296775a50c9a26ad2d2ec44b6c2e8a5df2c26/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:57 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/166592ee0b740ceb2d3654b773f296775a50c9a26ad2d2ec44b6c2e8a5df2c26/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 19:49:57 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d.
Nov 28 19:49:57 np0005539279 podman[201957]: 2025-11-29 00:49:57.107348848 +0000 UTC m=+0.154462269 container init 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:49:57 np0005539279 podman_exporter[201973]: ts=2025-11-29T00:49:57.129Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 19:49:57 np0005539279 podman_exporter[201973]: ts=2025-11-29T00:49:57.129Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 19:49:57 np0005539279 podman_exporter[201973]: ts=2025-11-29T00:49:57.129Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 19:49:57 np0005539279 podman_exporter[201973]: ts=2025-11-29T00:49:57.129Z caller=handler.go:105 level=info collector=container
Nov 28 19:49:57 np0005539279 podman[201725]: @ - - [29/Nov/2025:00:49:57 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 19:49:57 np0005539279 podman[201725]: time="2025-11-29T00:49:57Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 19:49:57 np0005539279 podman[201957]: 2025-11-29 00:49:57.151371002 +0000 UTC m=+0.198484353 container start 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:49:57 np0005539279 podman[201957]: podman_exporter
Nov 28 19:49:57 np0005539279 podman[201725]: @ - - [29/Nov/2025:00:49:57 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19569 "" "Go-http-client/1.1"
Nov 28 19:49:57 np0005539279 podman_exporter[201973]: ts=2025-11-29T00:49:57.160Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 28 19:49:57 np0005539279 podman_exporter[201973]: ts=2025-11-29T00:49:57.161Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 28 19:49:57 np0005539279 podman_exporter[201973]: ts=2025-11-29T00:49:57.161Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 28 19:49:57 np0005539279 systemd[1]: Started podman_exporter container.
Nov 28 19:49:57 np0005539279 podman[201983]: 2025-11-29 00:49:57.272685881 +0000 UTC m=+0.096580329 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 19:49:57 np0005539279 python3.9[202158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:49:58 np0005539279 python3.9[202283]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764377397.3939877-748-210758782859171/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 19:49:59 np0005539279 python3.9[202435]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 28 19:50:00 np0005539279 python3.9[202587]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 19:50:00 np0005539279 podman[202612]: 2025-11-29 00:50:00.865851173 +0000 UTC m=+0.105865352 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Nov 28 19:50:01 np0005539279 podman[202734]: 2025-11-29 00:50:01.393191751 +0000 UTC m=+0.081169961 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 19:50:01 np0005539279 systemd[1]: 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0-2a0f5b19067c9531.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 19:50:01 np0005539279 systemd[1]: 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0-2a0f5b19067c9531.service: Failed with result 'exit-code'.
Nov 28 19:50:01 np0005539279 python3[202781]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 19:50:04 np0005539279 podman[202795]: 2025-11-29 00:50:04.455022695 +0000 UTC m=+2.655882217 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 28 19:50:04 np0005539279 podman[202894]: 2025-11-29 00:50:04.698099117 +0000 UTC m=+0.072708843 container create 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Nov 28 19:50:04 np0005539279 podman[202894]: 2025-11-29 00:50:04.669292236 +0000 UTC m=+0.043901962 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 28 19:50:04 np0005539279 python3[202781]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 28 19:50:05 np0005539279 python3.9[203084]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:50:06 np0005539279 python3.9[203238]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:07 np0005539279 python3.9[203389]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764377406.8001795-801-144087374239602/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:50:08.079 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:50:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:50:08.080 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:50:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:50:08.080 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:50:08 np0005539279 python3.9[203466]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 19:50:08 np0005539279 systemd[1]: Reloading.
Nov 28 19:50:08 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:50:08 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:50:09 np0005539279 python3.9[203579]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 19:50:09 np0005539279 systemd[1]: Reloading.
Nov 28 19:50:09 np0005539279 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 19:50:09 np0005539279 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 19:50:09 np0005539279 systemd[1]: Starting openstack_network_exporter container...
Nov 28 19:50:09 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:50:09 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6775beafe8d60eaf83b60cd38fa6cd68c6728ddc1f47fddb5bc2670da6c65cf6/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 19:50:09 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6775beafe8d60eaf83b60cd38fa6cd68c6728ddc1f47fddb5bc2670da6c65cf6/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 19:50:09 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6775beafe8d60eaf83b60cd38fa6cd68c6728ddc1f47fddb5bc2670da6c65cf6/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 19:50:09 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352.
Nov 28 19:50:09 np0005539279 podman[203619]: 2025-11-29 00:50:09.860799546 +0000 UTC m=+0.179520049 container init 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *bridge.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *coverage.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *datapath.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *iface.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *memory.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *ovnnorthd.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *ovn.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *ovsdbserver.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *pmd_perf.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *pmd_rxq.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: INFO    00:50:09 main.go:48: registering *vswitch.Collector
Nov 28 19:50:09 np0005539279 openstack_network_exporter[203635]: NOTICE  00:50:09 main.go:76: listening on https://:9105/metrics
Nov 28 19:50:09 np0005539279 podman[203619]: 2025-11-29 00:50:09.901099609 +0000 UTC m=+0.219820062 container start 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 28 19:50:09 np0005539279 podman[203619]: openstack_network_exporter
Nov 28 19:50:09 np0005539279 systemd[1]: Started openstack_network_exporter container.
Nov 28 19:50:10 np0005539279 podman[203645]: 2025-11-29 00:50:10.028962856 +0000 UTC m=+0.104511805 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 19:50:11 np0005539279 python3.9[203820]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 19:50:11 np0005539279 systemd[1]: Stopping openstack_network_exporter container...
Nov 28 19:50:11 np0005539279 systemd[1]: libpod-31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352.scope: Deactivated successfully.
Nov 28 19:50:11 np0005539279 podman[203824]: 2025-11-29 00:50:11.586868571 +0000 UTC m=+0.049798572 container died 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 19:50:11 np0005539279 systemd[1]: 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352-47b1203937332b7e.timer: Deactivated successfully.
Nov 28 19:50:11 np0005539279 systemd[1]: Stopped /usr/bin/podman healthcheck run 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352.
Nov 28 19:50:11 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352-userdata-shm.mount: Deactivated successfully.
Nov 28 19:50:11 np0005539279 systemd[1]: var-lib-containers-storage-overlay-6775beafe8d60eaf83b60cd38fa6cd68c6728ddc1f47fddb5bc2670da6c65cf6-merged.mount: Deactivated successfully.
Nov 28 19:50:12 np0005539279 podman[203824]: 2025-11-29 00:50:12.503243707 +0000 UTC m=+0.966173708 container cleanup 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 19:50:12 np0005539279 podman[203824]: openstack_network_exporter
Nov 28 19:50:12 np0005539279 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 19:50:12 np0005539279 podman[203851]: openstack_network_exporter
Nov 28 19:50:12 np0005539279 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 28 19:50:12 np0005539279 systemd[1]: Stopped openstack_network_exporter container.
Nov 28 19:50:12 np0005539279 systemd[1]: Starting openstack_network_exporter container...
Nov 28 19:50:12 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:50:12 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6775beafe8d60eaf83b60cd38fa6cd68c6728ddc1f47fddb5bc2670da6c65cf6/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 19:50:12 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6775beafe8d60eaf83b60cd38fa6cd68c6728ddc1f47fddb5bc2670da6c65cf6/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 19:50:12 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6775beafe8d60eaf83b60cd38fa6cd68c6728ddc1f47fddb5bc2670da6c65cf6/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 19:50:12 np0005539279 systemd[1]: Started /usr/bin/podman healthcheck run 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352.
Nov 28 19:50:12 np0005539279 podman[203864]: 2025-11-29 00:50:12.789266643 +0000 UTC m=+0.162817396 container init 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., release=1755695350, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=edpm, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc.)
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *bridge.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *coverage.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *datapath.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *iface.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *memory.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *ovnnorthd.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *ovn.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *ovsdbserver.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *pmd_perf.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *pmd_rxq.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: INFO    00:50:12 main.go:48: registering *vswitch.Collector
Nov 28 19:50:12 np0005539279 openstack_network_exporter[203879]: NOTICE  00:50:12 main.go:76: listening on https://:9105/metrics
Nov 28 19:50:12 np0005539279 podman[203864]: 2025-11-29 00:50:12.839203407 +0000 UTC m=+0.212754130 container start 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 19:50:12 np0005539279 podman[203864]: openstack_network_exporter
Nov 28 19:50:12 np0005539279 systemd[1]: Started openstack_network_exporter container.
Nov 28 19:50:12 np0005539279 podman[203889]: 2025-11-29 00:50:12.942722744 +0000 UTC m=+0.086324961 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 28 19:50:13 np0005539279 python3.9[204062]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 19:50:13 np0005539279 podman[204063]: 2025-11-29 00:50:13.825112891 +0000 UTC m=+0.078073048 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:50:14 np0005539279 podman[204209]: 2025-11-29 00:50:14.894680194 +0000 UTC m=+0.133223304 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 19:50:15 np0005539279 python3.9[204254]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 28 19:50:16 np0005539279 python3.9[204428]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:16 np0005539279 systemd[1]: Started libpod-conmon-0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f.scope.
Nov 28 19:50:16 np0005539279 podman[204429]: 2025-11-29 00:50:16.261203808 +0000 UTC m=+0.107961949 container exec 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 19:50:16 np0005539279 podman[204429]: 2025-11-29 00:50:16.269093672 +0000 UTC m=+0.115851823 container exec_died 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125)
Nov 28 19:50:16 np0005539279 systemd[1]: libpod-conmon-0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f.scope: Deactivated successfully.
Nov 28 19:50:16 np0005539279 podman[204445]: 2025-11-29 00:50:16.376782092 +0000 UTC m=+0.121810254 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.034 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.035 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.067 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.067 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.068 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.069 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.069 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.105 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.106 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.107 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.107 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:50:17 np0005539279 python3.9[204629]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:17 np0005539279 systemd[1]: Started libpod-conmon-0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f.scope.
Nov 28 19:50:17 np0005539279 podman[204630]: 2025-11-29 00:50:17.28173379 +0000 UTC m=+0.115128083 container exec 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 19:50:17 np0005539279 podman[204630]: 2025-11-29 00:50:17.292965765 +0000 UTC m=+0.126360008 container exec_died 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.349 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.352 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5942MB free_disk=73.37424850463867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.353 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.353 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:50:17 np0005539279 systemd[1]: libpod-conmon-0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f.scope: Deactivated successfully.
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.451 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.451 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.484 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.505 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.508 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:50:17 np0005539279 nova_compute[187514]: 2025-11-29 00:50:17.508 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:50:18 np0005539279 python3.9[204813]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:19 np0005539279 nova_compute[187514]: 2025-11-29 00:50:19.049 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:19 np0005539279 nova_compute[187514]: 2025-11-29 00:50:19.049 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:50:19 np0005539279 nova_compute[187514]: 2025-11-29 00:50:19.050 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 19:50:19 np0005539279 nova_compute[187514]: 2025-11-29 00:50:19.076 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 19:50:19 np0005539279 nova_compute[187514]: 2025-11-29 00:50:19.076 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:19 np0005539279 nova_compute[187514]: 2025-11-29 00:50:19.076 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:19 np0005539279 nova_compute[187514]: 2025-11-29 00:50:19.076 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:50:19 np0005539279 python3.9[204965]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 28 19:50:20 np0005539279 python3.9[205130]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:20 np0005539279 systemd[1]: Started libpod-conmon-dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da.scope.
Nov 28 19:50:20 np0005539279 podman[205131]: 2025-11-29 00:50:20.399347476 +0000 UTC m=+0.107697671 container exec dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 19:50:20 np0005539279 podman[205131]: 2025-11-29 00:50:20.435004193 +0000 UTC m=+0.143354348 container exec_died dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 19:50:20 np0005539279 systemd[1]: libpod-conmon-dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da.scope: Deactivated successfully.
Nov 28 19:50:21 np0005539279 python3.9[205314]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:21 np0005539279 systemd[1]: Started libpod-conmon-dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da.scope.
Nov 28 19:50:21 np0005539279 podman[205315]: 2025-11-29 00:50:21.465525176 +0000 UTC m=+0.078739066 container exec dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 28 19:50:21 np0005539279 podman[205315]: 2025-11-29 00:50:21.502091787 +0000 UTC m=+0.115305607 container exec_died dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 19:50:21 np0005539279 systemd[1]: libpod-conmon-dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da.scope: Deactivated successfully.
Nov 28 19:50:22 np0005539279 python3.9[205499]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:23 np0005539279 python3.9[205651]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 28 19:50:24 np0005539279 python3.9[205815]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:24 np0005539279 systemd[1]: Started libpod-conmon-b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0.scope.
Nov 28 19:50:24 np0005539279 podman[205816]: 2025-11-29 00:50:24.326501323 +0000 UTC m=+0.095982364 container exec b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 28 19:50:24 np0005539279 podman[205816]: 2025-11-29 00:50:24.357873794 +0000 UTC m=+0.127354825 container exec_died b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 19:50:24 np0005539279 systemd[1]: libpod-conmon-b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0.scope: Deactivated successfully.
Nov 28 19:50:25 np0005539279 python3.9[205997]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:25 np0005539279 systemd[1]: Started libpod-conmon-b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0.scope.
Nov 28 19:50:25 np0005539279 podman[205998]: 2025-11-29 00:50:25.462131857 +0000 UTC m=+0.105293886 container exec b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 19:50:25 np0005539279 podman[205998]: 2025-11-29 00:50:25.494532945 +0000 UTC m=+0.137694934 container exec_died b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:50:25 np0005539279 systemd[1]: libpod-conmon-b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0.scope: Deactivated successfully.
Nov 28 19:50:26 np0005539279 python3.9[206178]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:27 np0005539279 python3.9[206330]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 28 19:50:27 np0005539279 podman[206443]: 2025-11-29 00:50:27.845915724 +0000 UTC m=+0.079542637 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 19:50:28 np0005539279 python3.9[206518]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:28 np0005539279 systemd[1]: Started libpod-conmon-120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0.scope.
Nov 28 19:50:28 np0005539279 podman[206519]: 2025-11-29 00:50:28.299802332 +0000 UTC m=+0.110372344 container exec 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 19:50:28 np0005539279 podman[206519]: 2025-11-29 00:50:28.335090929 +0000 UTC m=+0.145660931 container exec_died 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:50:28 np0005539279 systemd[1]: libpod-conmon-120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0.scope: Deactivated successfully.
Nov 28 19:50:29 np0005539279 python3.9[206700]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:29 np0005539279 systemd[1]: Started libpod-conmon-120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0.scope.
Nov 28 19:50:29 np0005539279 podman[206701]: 2025-11-29 00:50:29.402430531 +0000 UTC m=+0.109724626 container exec 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 19:50:29 np0005539279 podman[206701]: 2025-11-29 00:50:29.438282943 +0000 UTC m=+0.145577008 container exec_died 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125)
Nov 28 19:50:29 np0005539279 systemd[1]: libpod-conmon-120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0.scope: Deactivated successfully.
Nov 28 19:50:30 np0005539279 python3.9[206883]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:31 np0005539279 podman[207009]: 2025-11-29 00:50:31.054878871 +0000 UTC m=+0.106330072 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:50:31 np0005539279 python3.9[207056]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 28 19:50:31 np0005539279 podman[207132]: 2025-11-29 00:50:31.79367193 +0000 UTC m=+0.078395327 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 19:50:32 np0005539279 python3.9[207243]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:32 np0005539279 systemd[1]: Started libpod-conmon-b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b.scope.
Nov 28 19:50:32 np0005539279 podman[207244]: 2025-11-29 00:50:32.315585654 +0000 UTC m=+0.104519187 container exec b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:50:32 np0005539279 podman[207244]: 2025-11-29 00:50:32.349633782 +0000 UTC m=+0.138567325 container exec_died b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:50:32 np0005539279 systemd[1]: libpod-conmon-b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b.scope: Deactivated successfully.
Nov 28 19:50:33 np0005539279 python3.9[207427]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:33 np0005539279 systemd[1]: Started libpod-conmon-b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b.scope.
Nov 28 19:50:33 np0005539279 podman[207428]: 2025-11-29 00:50:33.422506845 +0000 UTC m=+0.107310120 container exec b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:50:33 np0005539279 podman[207428]: 2025-11-29 00:50:33.456968901 +0000 UTC m=+0.141772126 container exec_died b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 19:50:33 np0005539279 systemd[1]: libpod-conmon-b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b.scope: Deactivated successfully.
Nov 28 19:50:34 np0005539279 python3.9[207612]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:35 np0005539279 python3.9[207764]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 28 19:50:36 np0005539279 python3.9[207929]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:36 np0005539279 systemd[1]: Started libpod-conmon-5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d.scope.
Nov 28 19:50:36 np0005539279 podman[207930]: 2025-11-29 00:50:36.292523326 +0000 UTC m=+0.112546861 container exec 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:50:36 np0005539279 podman[207930]: 2025-11-29 00:50:36.32739044 +0000 UTC m=+0.147413975 container exec_died 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 19:50:36 np0005539279 systemd[1]: libpod-conmon-5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d.scope: Deactivated successfully.
Nov 28 19:50:37 np0005539279 python3.9[208111]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:37 np0005539279 systemd[1]: Started libpod-conmon-5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d.scope.
Nov 28 19:50:37 np0005539279 podman[208112]: 2025-11-29 00:50:37.305602655 +0000 UTC m=+0.088870209 container exec 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:50:37 np0005539279 podman[208112]: 2025-11-29 00:50:37.339005558 +0000 UTC m=+0.122273062 container exec_died 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:50:37 np0005539279 systemd[1]: libpod-conmon-5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d.scope: Deactivated successfully.
Nov 28 19:50:38 np0005539279 python3.9[208294]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:39 np0005539279 python3.9[208446]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 28 19:50:40 np0005539279 python3.9[208612]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:40 np0005539279 systemd[1]: Started libpod-conmon-31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352.scope.
Nov 28 19:50:40 np0005539279 podman[208613]: 2025-11-29 00:50:40.231521912 +0000 UTC m=+0.099768980 container exec 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public)
Nov 28 19:50:40 np0005539279 podman[208613]: 2025-11-29 00:50:40.26180611 +0000 UTC m=+0.130053128 container exec_died 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 19:50:40 np0005539279 systemd[1]: libpod-conmon-31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352.scope: Deactivated successfully.
Nov 28 19:50:41 np0005539279 python3.9[208794]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 19:50:41 np0005539279 systemd[1]: Started libpod-conmon-31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352.scope.
Nov 28 19:50:41 np0005539279 podman[208795]: 2025-11-29 00:50:41.356777754 +0000 UTC m=+0.167035818 container exec 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 28 19:50:41 np0005539279 podman[208815]: 2025-11-29 00:50:41.449740774 +0000 UTC m=+0.071599788 container exec_died 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Nov 28 19:50:41 np0005539279 podman[208795]: 2025-11-29 00:50:41.456842454 +0000 UTC m=+0.267100458 container exec_died 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.6, vendor=Red Hat, Inc.)
Nov 28 19:50:41 np0005539279 systemd[1]: libpod-conmon-31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352.scope: Deactivated successfully.
Nov 28 19:50:42 np0005539279 python3.9[208980]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:43 np0005539279 podman[209104]: 2025-11-29 00:50:43.112736934 +0000 UTC m=+0.101138564 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Nov 28 19:50:43 np0005539279 python3.9[209147]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:43 np0005539279 podman[209276]: 2025-11-29 00:50:43.973531464 +0000 UTC m=+0.080928161 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:50:44 np0005539279 python3.9[209327]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:50:44 np0005539279 python3.9[209450]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764377443.5367146-1082-272839410335024/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:45 np0005539279 podman[209574]: 2025-11-29 00:50:45.681975857 +0000 UTC m=+0.122701111 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:50:45 np0005539279 python3.9[209621]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:46 np0005539279 podman[209751]: 2025-11-29 00:50:46.543789397 +0000 UTC m=+0.081087781 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 28 19:50:46 np0005539279 python3.9[209798]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:50:47 np0005539279 python3.9[209878]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:48 np0005539279 python3.9[210030]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:50:48 np0005539279 python3.9[210108]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.bxz2zq5b recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:49 np0005539279 python3.9[210260]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:50:50 np0005539279 python3.9[210338]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:51 np0005539279 python3.9[210490]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:50:52 np0005539279 python3[210643]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 19:50:53 np0005539279 python3.9[210795]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:50:53 np0005539279 python3.9[210873]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:54 np0005539279 python3.9[211025]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:50:55 np0005539279 python3.9[211103]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:55 np0005539279 python3.9[211257]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:50:56 np0005539279 python3.9[211335]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:57 np0005539279 python3.9[211487]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:50:58 np0005539279 python3.9[211565]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:50:58 np0005539279 podman[211689]: 2025-11-29 00:50:58.778839637 +0000 UTC m=+0.087268678 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:50:58 np0005539279 python3.9[211742]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 19:50:59 np0005539279 python3.9[211867]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764377458.2599208-1207-249838044477692/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:51:00 np0005539279 python3.9[212019]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:51:01 np0005539279 podman[212172]: 2025-11-29 00:51:01.259763882 +0000 UTC m=+0.095624024 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=multipathd)
Nov 28 19:51:01 np0005539279 python3.9[212174]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:51:02 np0005539279 podman[212323]: 2025-11-29 00:51:02.300400831 +0000 UTC m=+0.096089545 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Nov 28 19:51:02 np0005539279 python3.9[212369]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:51:03 np0005539279 python3.9[212523]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:51:04 np0005539279 python3.9[212676]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 19:51:05 np0005539279 python3.9[212832]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 19:51:06 np0005539279 python3.9[212987]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 19:51:06 np0005539279 systemd[1]: session-26.scope: Deactivated successfully.
Nov 28 19:51:06 np0005539279 systemd[1]: session-26.scope: Consumed 2min 3.940s CPU time.
Nov 28 19:51:06 np0005539279 systemd-logind[811]: Session 26 logged out. Waiting for processes to exit.
Nov 28 19:51:06 np0005539279 systemd-logind[811]: Removed session 26.
Nov 28 19:51:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:51:08.080 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:51:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:51:08.080 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:51:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:51:08.081 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:51:13 np0005539279 podman[213014]: 2025-11-29 00:51:13.826460903 +0000 UTC m=+0.073775739 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 19:51:14 np0005539279 podman[213037]: 2025-11-29 00:51:14.839781451 +0000 UTC m=+0.073931963 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:51:15 np0005539279 podman[213063]: 2025-11-29 00:51:15.932386104 +0000 UTC m=+0.175029325 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.665 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.665 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.666 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.666 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:51:16 np0005539279 podman[213090]: 2025-11-29 00:51:16.854568194 +0000 UTC m=+0.092712792 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.923 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.924 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5961MB free_disk=73.37356948852539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.924 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:51:16 np0005539279 nova_compute[187514]: 2025-11-29 00:51:16.925 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:51:17 np0005539279 nova_compute[187514]: 2025-11-29 00:51:17.047 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:51:17 np0005539279 nova_compute[187514]: 2025-11-29 00:51:17.048 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:51:17 np0005539279 nova_compute[187514]: 2025-11-29 00:51:17.074 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:51:17 np0005539279 nova_compute[187514]: 2025-11-29 00:51:17.093 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:51:17 np0005539279 nova_compute[187514]: 2025-11-29 00:51:17.095 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:51:17 np0005539279 nova_compute[187514]: 2025-11-29 00:51:17.095 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:51:19 np0005539279 nova_compute[187514]: 2025-11-29 00:51:19.092 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:51:19 np0005539279 nova_compute[187514]: 2025-11-29 00:51:19.092 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:51:19 np0005539279 nova_compute[187514]: 2025-11-29 00:51:19.093 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:51:20 np0005539279 nova_compute[187514]: 2025-11-29 00:51:20.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:51:20 np0005539279 nova_compute[187514]: 2025-11-29 00:51:20.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:51:20 np0005539279 nova_compute[187514]: 2025-11-29 00:51:20.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 19:51:20 np0005539279 nova_compute[187514]: 2025-11-29 00:51:20.634 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 19:51:20 np0005539279 nova_compute[187514]: 2025-11-29 00:51:20.635 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:51:20 np0005539279 nova_compute[187514]: 2025-11-29 00:51:20.636 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:51:20 np0005539279 nova_compute[187514]: 2025-11-29 00:51:20.636 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:51:29 np0005539279 podman[213113]: 2025-11-29 00:51:29.855907432 +0000 UTC m=+0.074491789 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 19:51:31 np0005539279 podman[213138]: 2025-11-29 00:51:31.844919356 +0000 UTC m=+0.085644010 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:51:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:51:32 np0005539279 podman[213159]: 2025-11-29 00:51:32.845868908 +0000 UTC m=+0.087761770 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 19:51:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:51:41.453 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:51:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:51:41.456 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 19:51:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:51:41.459 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:51:44 np0005539279 podman[213181]: 2025-11-29 00:51:44.895949656 +0000 UTC m=+0.104208144 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm)
Nov 28 19:51:45 np0005539279 podman[213202]: 2025-11-29 00:51:45.012783785 +0000 UTC m=+0.078192976 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:51:46 np0005539279 podman[213226]: 2025-11-29 00:51:46.878549787 +0000 UTC m=+0.122654884 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2)
Nov 28 19:51:47 np0005539279 podman[213252]: 2025-11-29 00:51:47.003896111 +0000 UTC m=+0.085753382 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 19:52:00 np0005539279 podman[213276]: 2025-11-29 00:52:00.80326675 +0000 UTC m=+0.055044275 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 19:52:02 np0005539279 podman[213300]: 2025-11-29 00:52:02.827360361 +0000 UTC m=+0.073495996 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 19:52:03 np0005539279 podman[213320]: 2025-11-29 00:52:03.848456046 +0000 UTC m=+0.087482173 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 28 19:52:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:52:08.081 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:52:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:52:08.081 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:52:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:52:08.081 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:52:15 np0005539279 podman[213347]: 2025-11-29 00:52:15.846803738 +0000 UTC m=+0.080018631 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:52:15 np0005539279 podman[213346]: 2025-11-29 00:52:15.864355032 +0000 UTC m=+0.098040539 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, config_id=edpm, maintainer=Red Hat, Inc.)
Nov 28 19:52:16 np0005539279 nova_compute[187514]: 2025-11-29 00:52:16.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:17 np0005539279 nova_compute[187514]: 2025-11-29 00:52:17.192 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:17 np0005539279 nova_compute[187514]: 2025-11-29 00:52:17.192 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:52:17 np0005539279 podman[213392]: 2025-11-29 00:52:17.76749796 +0000 UTC m=+0.074748863 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:52:17 np0005539279 podman[213391]: 2025-11-29 00:52:17.808877406 +0000 UTC m=+0.123576482 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.669 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.670 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.670 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.670 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.888 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.889 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6028MB free_disk=73.37749481201172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.889 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.889 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.963 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.964 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:52:18 np0005539279 nova_compute[187514]: 2025-11-29 00:52:18.989 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:52:19 np0005539279 nova_compute[187514]: 2025-11-29 00:52:19.013 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:52:19 np0005539279 nova_compute[187514]: 2025-11-29 00:52:19.016 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:52:19 np0005539279 nova_compute[187514]: 2025-11-29 00:52:19.016 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:52:20 np0005539279 nova_compute[187514]: 2025-11-29 00:52:20.016 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:20 np0005539279 nova_compute[187514]: 2025-11-29 00:52:20.606 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:20 np0005539279 nova_compute[187514]: 2025-11-29 00:52:20.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:20 np0005539279 nova_compute[187514]: 2025-11-29 00:52:20.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:21 np0005539279 nova_compute[187514]: 2025-11-29 00:52:21.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:21 np0005539279 nova_compute[187514]: 2025-11-29 00:52:21.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:52:21 np0005539279 nova_compute[187514]: 2025-11-29 00:52:21.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 19:52:21 np0005539279 nova_compute[187514]: 2025-11-29 00:52:21.650 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 19:52:22 np0005539279 nova_compute[187514]: 2025-11-29 00:52:22.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:52:31 np0005539279 podman[213439]: 2025-11-29 00:52:31.824183025 +0000 UTC m=+0.080651080 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 19:52:33 np0005539279 podman[213467]: 2025-11-29 00:52:33.286565101 +0000 UTC m=+0.077523507 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:52:34 np0005539279 podman[213487]: 2025-11-29 00:52:34.854588742 +0000 UTC m=+0.097913215 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:52:46 np0005539279 podman[213512]: 2025-11-29 00:52:46.836831982 +0000 UTC m=+0.076104963 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 19:52:46 np0005539279 podman[213511]: 2025-11-29 00:52:46.840536242 +0000 UTC m=+0.085034959 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 28 19:52:48 np0005539279 podman[213558]: 2025-11-29 00:52:48.21790306 +0000 UTC m=+0.086642897 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 28 19:52:48 np0005539279 podman[213557]: 2025-11-29 00:52:48.25306408 +0000 UTC m=+0.130250459 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 28 19:53:02 np0005539279 podman[213605]: 2025-11-29 00:53:02.87209937 +0000 UTC m=+0.110296023 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:53:03 np0005539279 podman[213630]: 2025-11-29 00:53:03.84677662 +0000 UTC m=+0.092801990 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 19:53:05 np0005539279 podman[213652]: 2025-11-29 00:53:05.858325945 +0000 UTC m=+0.091887803 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 19:53:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:08.082 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:08.082 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:08.082 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:16 np0005539279 nova_compute[187514]: 2025-11-29 00:53:16.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:16 np0005539279 nova_compute[187514]: 2025-11-29 00:53:16.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 19:53:16 np0005539279 nova_compute[187514]: 2025-11-29 00:53:16.633 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 19:53:16 np0005539279 nova_compute[187514]: 2025-11-29 00:53:16.634 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:16 np0005539279 nova_compute[187514]: 2025-11-29 00:53:16.634 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 19:53:16 np0005539279 nova_compute[187514]: 2025-11-29 00:53:16.650 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:17 np0005539279 nova_compute[187514]: 2025-11-29 00:53:17.662 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:17 np0005539279 nova_compute[187514]: 2025-11-29 00:53:17.662 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:53:17 np0005539279 podman[213674]: 2025-11-29 00:53:17.836330883 +0000 UTC m=+0.069378301 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Nov 28 19:53:17 np0005539279 podman[213675]: 2025-11-29 00:53:17.853713272 +0000 UTC m=+0.077536435 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.646 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.647 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.647 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.647 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:53:18 np0005539279 podman[213719]: 2025-11-29 00:53:18.876777685 +0000 UTC m=+0.117131057 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.881 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.883 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6052MB free_disk=73.37747192382812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.884 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:18 np0005539279 nova_compute[187514]: 2025-11-29 00:53:18.884 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:18 np0005539279 podman[213718]: 2025-11-29 00:53:18.885061162 +0000 UTC m=+0.127661841 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.207 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.208 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.379 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing inventories for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.474 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating ProviderTree inventory for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.475 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.492 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing aggregate associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.520 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing trait associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, traits: COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,HW_CPU_X86_SSE,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.558 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.575 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.576 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:53:19 np0005539279 nova_compute[187514]: 2025-11-29 00:53:19.576 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:20 np0005539279 nova_compute[187514]: 2025-11-29 00:53:20.576 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:20 np0005539279 nova_compute[187514]: 2025-11-29 00:53:20.604 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:20 np0005539279 nova_compute[187514]: 2025-11-29 00:53:20.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:21.438 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:53:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:21.439 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 19:53:21 np0005539279 nova_compute[187514]: 2025-11-29 00:53:21.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:21 np0005539279 nova_compute[187514]: 2025-11-29 00:53:21.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:53:21 np0005539279 nova_compute[187514]: 2025-11-29 00:53:21.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 19:53:21 np0005539279 nova_compute[187514]: 2025-11-29 00:53:21.646 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 19:53:21 np0005539279 nova_compute[187514]: 2025-11-29 00:53:21.648 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:23 np0005539279 nova_compute[187514]: 2025-11-29 00:53:23.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:53:25 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:25.441 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.305 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:53:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:53:33 np0005539279 podman[213771]: 2025-11-29 00:53:33.83969227 +0000 UTC m=+0.081981198 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 19:53:34 np0005539279 podman[213796]: 2025-11-29 00:53:34.855013963 +0000 UTC m=+0.095560593 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 19:53:36 np0005539279 podman[213816]: 2025-11-29 00:53:36.83245716 +0000 UTC m=+0.079730500 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 19:53:42 np0005539279 nova_compute[187514]: 2025-11-29 00:53:42.798 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "b684198c-70de-4847-95da-9b3d77da7dbb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:42 np0005539279 nova_compute[187514]: 2025-11-29 00:53:42.800 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:42 np0005539279 nova_compute[187514]: 2025-11-29 00:53:42.831 187518 DEBUG nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.036 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.037 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.050 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.051 187518 INFO nova.compute.claims [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.268 187518 DEBUG nova.compute.provider_tree [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.285 187518 DEBUG nova.scheduler.client.report [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.309 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.310 187518 DEBUG nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.362 187518 DEBUG nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.364 187518 DEBUG nova.network.neutron [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.395 187518 INFO nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.417 187518 DEBUG nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.527 187518 DEBUG nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.529 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.530 187518 INFO nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Creating image(s)#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.531 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.531 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.532 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.532 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:43 np0005539279 nova_compute[187514]: 2025-11-29 00:53:43.533 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:44 np0005539279 nova_compute[187514]: 2025-11-29 00:53:44.202 187518 WARNING oslo_policy.policy [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 28 19:53:44 np0005539279 nova_compute[187514]: 2025-11-29 00:53:44.203 187518 WARNING oslo_policy.policy [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Nov 28 19:53:44 np0005539279 nova_compute[187514]: 2025-11-29 00:53:44.207 187518 DEBUG nova.policy [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.408 187518 DEBUG nova.network.neutron [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Successfully created port: dc0d6f5b-4063-4940-975e-10b9379eb880 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.528 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.615 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f.part --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.617 187518 DEBUG nova.virt.images [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] 017f04d5-006e-46df-a06f-ac852f70dddf was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.619 187518 DEBUG nova.privsep.utils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.619 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f.part /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.802 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f.part /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f.converted" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.806 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.885 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f.converted --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.887 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:45 np0005539279 nova_compute[187514]: 2025-11-29 00:53:45.920 187518 INFO oslo.privsep.daemon [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpvuqx1fpz/privsep.sock']#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.687 187518 INFO oslo.privsep.daemon [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.511 213861 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.519 213861 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.523 213861 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.523 213861 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213861#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.792 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.818 187518 DEBUG nova.network.neutron [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Successfully updated port: dc0d6f5b-4063-4940-975e-10b9379eb880 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.849 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.850 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.850 187518 DEBUG nova.network.neutron [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.873 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.874 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.875 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.890 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.944 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.945 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.979 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.981 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:46 np0005539279 nova_compute[187514]: 2025-11-29 00:53:46.982 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.062 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.064 187518 DEBUG nova.virt.disk.api [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.064 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.138 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.140 187518 DEBUG nova.virt.disk.api [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.140 187518 DEBUG nova.objects.instance [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid b684198c-70de-4847-95da-9b3d77da7dbb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.164 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.164 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Ensure instance console log exists: /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.165 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.166 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.166 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.212 187518 DEBUG nova.network.neutron [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.350 187518 DEBUG nova.compute.manager [req-d09e3eae-e3c0-4296-9f53-ea8a35bc90ca req-c4e0657b-94b0-4041-81b6-04ea1c2efc5b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received event network-changed-dc0d6f5b-4063-4940-975e-10b9379eb880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.351 187518 DEBUG nova.compute.manager [req-d09e3eae-e3c0-4296-9f53-ea8a35bc90ca req-c4e0657b-94b0-4041-81b6-04ea1c2efc5b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Refreshing instance network info cache due to event network-changed-dc0d6f5b-4063-4940-975e-10b9379eb880. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.351 187518 DEBUG oslo_concurrency.lockutils [req-d09e3eae-e3c0-4296-9f53-ea8a35bc90ca req-c4e0657b-94b0-4041-81b6-04ea1c2efc5b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:53:47 np0005539279 nova_compute[187514]: 2025-11-29 00:53:47.990 187518 DEBUG nova.network.neutron [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updating instance_info_cache with network_info: [{"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.010 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.011 187518 DEBUG nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Instance network_info: |[{"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.012 187518 DEBUG oslo_concurrency.lockutils [req-d09e3eae-e3c0-4296-9f53-ea8a35bc90ca req-c4e0657b-94b0-4041-81b6-04ea1c2efc5b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.012 187518 DEBUG nova.network.neutron [req-d09e3eae-e3c0-4296-9f53-ea8a35bc90ca req-c4e0657b-94b0-4041-81b6-04ea1c2efc5b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Refreshing network info cache for port dc0d6f5b-4063-4940-975e-10b9379eb880 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.018 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Start _get_guest_xml network_info=[{"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.024 187518 WARNING nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.029 187518 DEBUG nova.virt.libvirt.host [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.030 187518 DEBUG nova.virt.libvirt.host [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.033 187518 DEBUG nova.virt.libvirt.host [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.034 187518 DEBUG nova.virt.libvirt.host [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.035 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.035 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.038 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.038 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.039 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.039 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.039 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.040 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.040 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.041 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.041 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.041 187518 DEBUG nova.virt.hardware [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.047 187518 DEBUG nova.privsep.utils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.048 187518 DEBUG nova.virt.libvirt.vif [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:53:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1039049178',display_name='tempest-TestNetworkBasicOps-server-1039049178',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1039049178',id=1,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHctOnxtO1cZjnywvziAgspEU7SiXv/37/3xfOAey/+qXIzu7yeRWuxik3GnzwZDqAYudEb2ozpm4Jl84nvxbVOaAVyNgscfEkyUwG86RbJ/uw52uW9+STd2w/CiuqFJAQ==',key_name='tempest-TestNetworkBasicOps-860229746',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-6s9ppta1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:53:43Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=b684198c-70de-4847-95da-9b3d77da7dbb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.049 187518 DEBUG nova.network.os_vif_util [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.050 187518 DEBUG nova.network.os_vif_util [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:1a:67,bridge_name='br-int',has_traffic_filtering=True,id=dc0d6f5b-4063-4940-975e-10b9379eb880,network=Network(4c74f40e-8f35-48f0-bee4-57a35c0924f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc0d6f5b-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.052 187518 DEBUG nova.objects.instance [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid b684198c-70de-4847-95da-9b3d77da7dbb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.076 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] End _get_guest_xml xml=<domain type="kvm">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <uuid>b684198c-70de-4847-95da-9b3d77da7dbb</uuid>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <name>instance-00000001</name>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-1039049178</nova:name>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 00:53:48</nova:creationTime>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:        <nova:port uuid="dc0d6f5b-4063-4940-975e-10b9379eb880">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <system>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <entry name="serial">b684198c-70de-4847-95da-9b3d77da7dbb</entry>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <entry name="uuid">b684198c-70de-4847-95da-9b3d77da7dbb</entry>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    </system>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <os>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  </clock>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk.config"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:5b:1a:67"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <target dev="tapdc0d6f5b-40"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/console.log" append="off"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    </serial>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <video>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 19:53:48 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 19:53:48 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:53:48 np0005539279 nova_compute[187514]: </domain>
Nov 28 19:53:48 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.077 187518 DEBUG nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Preparing to wait for external event network-vif-plugged-dc0d6f5b-4063-4940-975e-10b9379eb880 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.077 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.077 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.078 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.078 187518 DEBUG nova.virt.libvirt.vif [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:53:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1039049178',display_name='tempest-TestNetworkBasicOps-server-1039049178',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1039049178',id=1,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHctOnxtO1cZjnywvziAgspEU7SiXv/37/3xfOAey/+qXIzu7yeRWuxik3GnzwZDqAYudEb2ozpm4Jl84nvxbVOaAVyNgscfEkyUwG86RbJ/uw52uW9+STd2w/CiuqFJAQ==',key_name='tempest-TestNetworkBasicOps-860229746',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-6s9ppta1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:53:43Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=b684198c-70de-4847-95da-9b3d77da7dbb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.078 187518 DEBUG nova.network.os_vif_util [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.079 187518 DEBUG nova.network.os_vif_util [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:1a:67,bridge_name='br-int',has_traffic_filtering=True,id=dc0d6f5b-4063-4940-975e-10b9379eb880,network=Network(4c74f40e-8f35-48f0-bee4-57a35c0924f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc0d6f5b-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.079 187518 DEBUG os_vif [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:1a:67,bridge_name='br-int',has_traffic_filtering=True,id=dc0d6f5b-4063-4940-975e-10b9379eb880,network=Network(4c74f40e-8f35-48f0-bee4-57a35c0924f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc0d6f5b-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.111 187518 DEBUG ovsdbapp.backend.ovs_idl [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.111 187518 DEBUG ovsdbapp.backend.ovs_idl [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.112 187518 DEBUG ovsdbapp.backend.ovs_idl [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.112 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.113 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.113 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.114 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.135 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.136 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.137 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.138 187518 INFO oslo.privsep.daemon [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmperll67vu/privsep.sock']#033[00m
Nov 28 19:53:48 np0005539279 podman[213884]: 2025-11-29 00:53:48.559407562 +0000 UTC m=+0.117980900 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350)
Nov 28 19:53:48 np0005539279 podman[213885]: 2025-11-29 00:53:48.56883813 +0000 UTC m=+0.120995299 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.895 187518 INFO oslo.privsep.daemon [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.740 213930 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.745 213930 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.747 213930 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Nov 28 19:53:48 np0005539279 nova_compute[187514]: 2025-11-29 00:53:48.747 213930 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213930#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.207 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.208 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdc0d6f5b-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.209 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdc0d6f5b-40, col_values=(('external_ids', {'iface-id': 'dc0d6f5b-4063-4940-975e-10b9379eb880', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:1a:67', 'vm-uuid': 'b684198c-70de-4847-95da-9b3d77da7dbb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.211 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:49 np0005539279 NetworkManager[55703]: <info>  [1764377629.2129] manager: (tapdc0d6f5b-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.213 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.220 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.221 187518 INFO os_vif [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:1a:67,bridge_name='br-int',has_traffic_filtering=True,id=dc0d6f5b-4063-4940-975e-10b9379eb880,network=Network(4c74f40e-8f35-48f0-bee4-57a35c0924f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc0d6f5b-40')#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.282 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.283 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.283 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:5b:1a:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:53:49 np0005539279 nova_compute[187514]: 2025-11-29 00:53:49.284 187518 INFO nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Using config drive#033[00m
Nov 28 19:53:49 np0005539279 podman[213937]: 2025-11-29 00:53:49.858980935 +0000 UTC m=+0.090595023 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Nov 28 19:53:49 np0005539279 podman[213936]: 2025-11-29 00:53:49.90560091 +0000 UTC m=+0.142884675 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.062 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.127 187518 DEBUG nova.network.neutron [req-d09e3eae-e3c0-4296-9f53-ea8a35bc90ca req-c4e0657b-94b0-4041-81b6-04ea1c2efc5b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updated VIF entry in instance network info cache for port dc0d6f5b-4063-4940-975e-10b9379eb880. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.128 187518 DEBUG nova.network.neutron [req-d09e3eae-e3c0-4296-9f53-ea8a35bc90ca req-c4e0657b-94b0-4041-81b6-04ea1c2efc5b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updating instance_info_cache with network_info: [{"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.149 187518 DEBUG oslo_concurrency.lockutils [req-d09e3eae-e3c0-4296-9f53-ea8a35bc90ca req-c4e0657b-94b0-4041-81b6-04ea1c2efc5b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.197 187518 INFO nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Creating config drive at /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk.config#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.205 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsurhbqsw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.344 187518 DEBUG oslo_concurrency.processutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsurhbqsw" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:53:52 np0005539279 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 28 19:53:52 np0005539279 NetworkManager[55703]: <info>  [1764377632.4411] manager: (tapdc0d6f5b-40): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Nov 28 19:53:52 np0005539279 kernel: tapdc0d6f5b-40: entered promiscuous mode
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.443 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:52 np0005539279 ovn_controller[95686]: 2025-11-29T00:53:52Z|00027|binding|INFO|Claiming lport dc0d6f5b-4063-4940-975e-10b9379eb880 for this chassis.
Nov 28 19:53:52 np0005539279 ovn_controller[95686]: 2025-11-29T00:53:52Z|00028|binding|INFO|dc0d6f5b-4063-4940-975e-10b9379eb880: Claiming fa:16:3e:5b:1a:67 10.100.0.9
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.451 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:52.477 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:1a:67 10.100.0.9'], port_security=['fa:16:3e:5b:1a:67 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b684198c-70de-4847-95da-9b3d77da7dbb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c74f40e-8f35-48f0-bee4-57a35c0924f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ddd9cfd4-609a-4c6d-8c17-5dc3fb4f927f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=127f73d2-2151-4a8e-987c-7618da8ab21d, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=dc0d6f5b-4063-4940-975e-10b9379eb880) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:53:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:52.479 104584 INFO neutron.agent.ovn.metadata.agent [-] Port dc0d6f5b-4063-4940-975e-10b9379eb880 in datapath 4c74f40e-8f35-48f0-bee4-57a35c0924f2 bound to our chassis#033[00m
Nov 28 19:53:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:52.482 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4c74f40e-8f35-48f0-bee4-57a35c0924f2#033[00m
Nov 28 19:53:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:52.485 104584 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmprl13ntcj/privsep.sock']#033[00m
Nov 28 19:53:52 np0005539279 systemd-udevd[214001]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:53:52 np0005539279 NetworkManager[55703]: <info>  [1764377632.5281] device (tapdc0d6f5b-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:53:52 np0005539279 NetworkManager[55703]: <info>  [1764377632.5295] device (tapdc0d6f5b-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 19:53:52 np0005539279 systemd-machined[153752]: New machine qemu-1-instance-00000001.
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.564 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:52 np0005539279 ovn_controller[95686]: 2025-11-29T00:53:52Z|00029|binding|INFO|Setting lport dc0d6f5b-4063-4940-975e-10b9379eb880 ovn-installed in OVS
Nov 28 19:53:52 np0005539279 ovn_controller[95686]: 2025-11-29T00:53:52Z|00030|binding|INFO|Setting lport dc0d6f5b-4063-4940-975e-10b9379eb880 up in Southbound
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.570 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:52 np0005539279 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.872 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377632.8718472, b684198c-70de-4847-95da-9b3d77da7dbb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.873 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] VM Started (Lifecycle Event)#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.929 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.934 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377632.8720899, b684198c-70de-4847-95da-9b3d77da7dbb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.935 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] VM Paused (Lifecycle Event)#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.965 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.969 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:53:52 np0005539279 nova_compute[187514]: 2025-11-29 00:53:52.996 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.165 104584 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.166 104584 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmprl13ntcj/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.046 214026 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.050 214026 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.053 214026 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.053 214026 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214026#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.170 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[32588e87-5289-4be9-b8f4-8bb4590ef023]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.435 187518 DEBUG nova.compute.manager [req-d95b5007-c70f-4af3-aa51-5ab71e15390d req-2b32deea-88d6-42b1-a8c8-62f8d2dcd5f5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received event network-vif-plugged-dc0d6f5b-4063-4940-975e-10b9379eb880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.436 187518 DEBUG oslo_concurrency.lockutils [req-d95b5007-c70f-4af3-aa51-5ab71e15390d req-2b32deea-88d6-42b1-a8c8-62f8d2dcd5f5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.437 187518 DEBUG oslo_concurrency.lockutils [req-d95b5007-c70f-4af3-aa51-5ab71e15390d req-2b32deea-88d6-42b1-a8c8-62f8d2dcd5f5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.437 187518 DEBUG oslo_concurrency.lockutils [req-d95b5007-c70f-4af3-aa51-5ab71e15390d req-2b32deea-88d6-42b1-a8c8-62f8d2dcd5f5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.437 187518 DEBUG nova.compute.manager [req-d95b5007-c70f-4af3-aa51-5ab71e15390d req-2b32deea-88d6-42b1-a8c8-62f8d2dcd5f5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Processing event network-vif-plugged-dc0d6f5b-4063-4940-975e-10b9379eb880 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.439 187518 DEBUG nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.444 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.445 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377633.443889, b684198c-70de-4847-95da-9b3d77da7dbb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.445 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] VM Resumed (Lifecycle Event)#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.454 187518 INFO nova.virt.libvirt.driver [-] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Instance spawned successfully.#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.454 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.489 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.494 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.528 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.537 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.538 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.538 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.538 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.539 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.539 187518 DEBUG nova.virt.libvirt.driver [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.611 187518 INFO nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Took 10.08 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.613 187518 DEBUG nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.680 187518 INFO nova.compute.manager [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Took 10.69 seconds to build instance.#033[00m
Nov 28 19:53:53 np0005539279 nova_compute[187514]: 2025-11-29 00:53:53.698 187518 DEBUG oslo_concurrency.lockutils [None req-cfc7b434-ff5f-43b0-89a1-905d027da0e3 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.719 214026 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.719 214026 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:53.720 214026 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:54 np0005539279 nova_compute[187514]: 2025-11-29 00:53:54.212 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.272 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[a690e75a-b472-49bf-b2bd-49891570eacc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.273 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4c74f40e-81 in ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.275 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4c74f40e-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.276 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[93dbc14b-6418-48c9-8e32-2ef29572b333]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.279 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[f66c0177-904e-44d7-8775-bcae4c61ffca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.318 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c7e916-899e-4dac-b5a9-6fb1f376586f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.348 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c6a64734-be13-4f88-ab83-568f1f373e3e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.350 104584 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmps5c5loj5/privsep.sock']#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.989 104584 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.991 104584 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmps5c5loj5/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.893 214042 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.902 214042 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.906 214042 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.907 214042 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214042#033[00m
Nov 28 19:53:54 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:54.995 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[009acb35-6e02-4591-bcef-a2814944ab6b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:55 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:55.439 214042 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:55 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:55.439 214042 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:55 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:55.440 214042 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:55 np0005539279 nova_compute[187514]: 2025-11-29 00:53:55.595 187518 DEBUG nova.compute.manager [req-e76540e5-22f6-43f2-aa04-da4cfaf10f71 req-13d9b028-0e72-4b35-8a4b-e1caca832863 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received event network-vif-plugged-dc0d6f5b-4063-4940-975e-10b9379eb880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:53:55 np0005539279 nova_compute[187514]: 2025-11-29 00:53:55.599 187518 DEBUG oslo_concurrency.lockutils [req-e76540e5-22f6-43f2-aa04-da4cfaf10f71 req-13d9b028-0e72-4b35-8a4b-e1caca832863 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:53:55 np0005539279 nova_compute[187514]: 2025-11-29 00:53:55.600 187518 DEBUG oslo_concurrency.lockutils [req-e76540e5-22f6-43f2-aa04-da4cfaf10f71 req-13d9b028-0e72-4b35-8a4b-e1caca832863 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:53:55 np0005539279 nova_compute[187514]: 2025-11-29 00:53:55.600 187518 DEBUG oslo_concurrency.lockutils [req-e76540e5-22f6-43f2-aa04-da4cfaf10f71 req-13d9b028-0e72-4b35-8a4b-e1caca832863 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:53:55 np0005539279 nova_compute[187514]: 2025-11-29 00:53:55.601 187518 DEBUG nova.compute.manager [req-e76540e5-22f6-43f2-aa04-da4cfaf10f71 req-13d9b028-0e72-4b35-8a4b-e1caca832863 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] No waiting events found dispatching network-vif-plugged-dc0d6f5b-4063-4940-975e-10b9379eb880 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:53:55 np0005539279 nova_compute[187514]: 2025-11-29 00:53:55.602 187518 WARNING nova.compute.manager [req-e76540e5-22f6-43f2-aa04-da4cfaf10f71 req-13d9b028-0e72-4b35-8a4b-e1caca832863 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received unexpected event network-vif-plugged-dc0d6f5b-4063-4940-975e-10b9379eb880 for instance with vm_state active and task_state None.#033[00m
Nov 28 19:53:55 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:55.996 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[f00921b9-e260-4927-81a6-b39edc508567]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.023 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[ede25b5a-59f5-4ff2-b2f3-6666fc929583]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 NetworkManager[55703]: <info>  [1764377636.0264] manager: (tap4c74f40e-80): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.060 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[ee25e5ee-c74c-4ab3-9aeb-627b1b77a73e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 systemd-udevd[214056]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.066 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[b852cb0e-1769-4597-98b2-18b4f8f2e1f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 NetworkManager[55703]: <info>  [1764377636.0923] device (tap4c74f40e-80): carrier: link connected
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.098 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[9c796492-c410-43d6-b4c0-36a110963eb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.122 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b5d16261-df2a-4621-ba77-eaa0def59161]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c74f40e-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:cb:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356979, 'reachable_time': 42400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214074, 'error': None, 'target': 'ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.139 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0002b5b1-58fe-4d93-988c-3b8410e25aef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1e:cb93'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 356979, 'tstamp': 356979}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214075, 'error': None, 'target': 'ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.156 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[f968ef9e-f18d-487d-9e59-d69d84fe70b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4c74f40e-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:cb:93'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356979, 'reachable_time': 42400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214076, 'error': None, 'target': 'ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.194 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[688626d2-bf2f-42e8-9db8-5cc778d13b7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.270 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[6e91ff2a-94b8-42ed-bf35-b75ee67b8d6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.273 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c74f40e-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.274 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.275 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c74f40e-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:53:56 np0005539279 NetworkManager[55703]: <info>  [1764377636.2791] manager: (tap4c74f40e-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 28 19:53:56 np0005539279 nova_compute[187514]: 2025-11-29 00:53:56.278 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:56 np0005539279 kernel: tap4c74f40e-80: entered promiscuous mode
Nov 28 19:53:56 np0005539279 nova_compute[187514]: 2025-11-29 00:53:56.282 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.283 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4c74f40e-80, col_values=(('external_ids', {'iface-id': 'f9056d3b-5257-40f1-ae67-c7beec09428a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:53:56 np0005539279 ovn_controller[95686]: 2025-11-29T00:53:56Z|00031|binding|INFO|Releasing lport f9056d3b-5257-40f1-ae67-c7beec09428a from this chassis (sb_readonly=0)
Nov 28 19:53:56 np0005539279 nova_compute[187514]: 2025-11-29 00:53:56.285 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:56 np0005539279 nova_compute[187514]: 2025-11-29 00:53:56.310 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.311 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4c74f40e-8f35-48f0-bee4-57a35c0924f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4c74f40e-8f35-48f0-bee4-57a35c0924f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.312 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2525ba42-4084-4be5-b311-60099934bdee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.314 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-4c74f40e-8f35-48f0-bee4-57a35c0924f2
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/4c74f40e-8f35-48f0-bee4-57a35c0924f2.pid.haproxy
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID 4c74f40e-8f35-48f0-bee4-57a35c0924f2
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 19:53:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:53:56.316 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2', 'env', 'PROCESS_TAG=haproxy-4c74f40e-8f35-48f0-bee4-57a35c0924f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4c74f40e-8f35-48f0-bee4-57a35c0924f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 19:53:56 np0005539279 podman[214109]: 2025-11-29 00:53:56.816073258 +0000 UTC m=+0.092450498 container create f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 19:53:56 np0005539279 podman[214109]: 2025-11-29 00:53:56.767792644 +0000 UTC m=+0.044169934 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 19:53:56 np0005539279 systemd[1]: Started libpod-conmon-f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9.scope.
Nov 28 19:53:56 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:53:56 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3fde8772c750be15c64c8fd727b2cb41f7bd0e1dbdb03209ec2485b98eaabb82/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 19:53:56 np0005539279 podman[214109]: 2025-11-29 00:53:56.945300418 +0000 UTC m=+0.221677708 container init f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 19:53:56 np0005539279 podman[214109]: 2025-11-29 00:53:56.95553933 +0000 UTC m=+0.231916570 container start f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:53:57 np0005539279 neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2[214124]: [NOTICE]   (214128) : New worker (214130) forked
Nov 28 19:53:57 np0005539279 neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2[214124]: [NOTICE]   (214128) : Loading success.
Nov 28 19:53:57 np0005539279 nova_compute[187514]: 2025-11-29 00:53:57.066 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:59 np0005539279 nova_compute[187514]: 2025-11-29 00:53:59.240 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:59 np0005539279 ovn_controller[95686]: 2025-11-29T00:53:59Z|00032|binding|INFO|Releasing lport f9056d3b-5257-40f1-ae67-c7beec09428a from this chassis (sb_readonly=0)
Nov 28 19:53:59 np0005539279 nova_compute[187514]: 2025-11-29 00:53:59.345 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:59 np0005539279 NetworkManager[55703]: <info>  [1764377639.3467] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Nov 28 19:53:59 np0005539279 NetworkManager[55703]: <info>  [1764377639.3477] device (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:53:59 np0005539279 NetworkManager[55703]: <info>  [1764377639.3493] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Nov 28 19:53:59 np0005539279 NetworkManager[55703]: <info>  [1764377639.3538] device (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 19:53:59 np0005539279 NetworkManager[55703]: <info>  [1764377639.3552] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Nov 28 19:53:59 np0005539279 NetworkManager[55703]: <info>  [1764377639.3562] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Nov 28 19:53:59 np0005539279 NetworkManager[55703]: <info>  [1764377639.3567] device (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 28 19:53:59 np0005539279 NetworkManager[55703]: <info>  [1764377639.3570] device (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 28 19:53:59 np0005539279 ovn_controller[95686]: 2025-11-29T00:53:59Z|00033|binding|INFO|Releasing lport f9056d3b-5257-40f1-ae67-c7beec09428a from this chassis (sb_readonly=0)
Nov 28 19:53:59 np0005539279 nova_compute[187514]: 2025-11-29 00:53:59.377 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:59 np0005539279 nova_compute[187514]: 2025-11-29 00:53:59.383 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:53:59 np0005539279 nova_compute[187514]: 2025-11-29 00:53:59.559 187518 DEBUG nova.compute.manager [req-0965ddab-de7c-4718-abd4-8afa0a097ac0 req-05510201-b277-42d4-a3e3-97eecfe1f2fb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received event network-changed-dc0d6f5b-4063-4940-975e-10b9379eb880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:53:59 np0005539279 nova_compute[187514]: 2025-11-29 00:53:59.560 187518 DEBUG nova.compute.manager [req-0965ddab-de7c-4718-abd4-8afa0a097ac0 req-05510201-b277-42d4-a3e3-97eecfe1f2fb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Refreshing instance network info cache due to event network-changed-dc0d6f5b-4063-4940-975e-10b9379eb880. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:53:59 np0005539279 nova_compute[187514]: 2025-11-29 00:53:59.561 187518 DEBUG oslo_concurrency.lockutils [req-0965ddab-de7c-4718-abd4-8afa0a097ac0 req-05510201-b277-42d4-a3e3-97eecfe1f2fb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:53:59 np0005539279 nova_compute[187514]: 2025-11-29 00:53:59.561 187518 DEBUG oslo_concurrency.lockutils [req-0965ddab-de7c-4718-abd4-8afa0a097ac0 req-05510201-b277-42d4-a3e3-97eecfe1f2fb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:53:59 np0005539279 nova_compute[187514]: 2025-11-29 00:53:59.561 187518 DEBUG nova.network.neutron [req-0965ddab-de7c-4718-abd4-8afa0a097ac0 req-05510201-b277-42d4-a3e3-97eecfe1f2fb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Refreshing network info cache for port dc0d6f5b-4063-4940-975e-10b9379eb880 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:54:01 np0005539279 nova_compute[187514]: 2025-11-29 00:54:01.403 187518 DEBUG nova.network.neutron [req-0965ddab-de7c-4718-abd4-8afa0a097ac0 req-05510201-b277-42d4-a3e3-97eecfe1f2fb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updated VIF entry in instance network info cache for port dc0d6f5b-4063-4940-975e-10b9379eb880. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:54:01 np0005539279 nova_compute[187514]: 2025-11-29 00:54:01.405 187518 DEBUG nova.network.neutron [req-0965ddab-de7c-4718-abd4-8afa0a097ac0 req-05510201-b277-42d4-a3e3-97eecfe1f2fb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updating instance_info_cache with network_info: [{"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:54:01 np0005539279 nova_compute[187514]: 2025-11-29 00:54:01.429 187518 DEBUG oslo_concurrency.lockutils [req-0965ddab-de7c-4718-abd4-8afa0a097ac0 req-05510201-b277-42d4-a3e3-97eecfe1f2fb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:54:02 np0005539279 nova_compute[187514]: 2025-11-29 00:54:02.069 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:04 np0005539279 nova_compute[187514]: 2025-11-29 00:54:04.270 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:04Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:1a:67 10.100.0.9
Nov 28 19:54:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:04Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:1a:67 10.100.0.9
Nov 28 19:54:04 np0005539279 podman[214159]: 2025-11-29 00:54:04.865674315 +0000 UTC m=+0.093227519 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 19:54:05 np0005539279 podman[214183]: 2025-11-29 00:54:05.008977751 +0000 UTC m=+0.100511595 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Nov 28 19:54:07 np0005539279 nova_compute[187514]: 2025-11-29 00:54:07.072 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:07 np0005539279 podman[214207]: 2025-11-29 00:54:07.861832758 +0000 UTC m=+0.095959171 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:54:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:08.087 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:08.089 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:08.089 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:09 np0005539279 nova_compute[187514]: 2025-11-29 00:54:09.314 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:10 np0005539279 nova_compute[187514]: 2025-11-29 00:54:10.002 187518 INFO nova.compute.manager [None req-10acbfa0-5bd5-4439-98e7-3661de64b835 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Get console output#033[00m
Nov 28 19:54:10 np0005539279 nova_compute[187514]: 2025-11-29 00:54:10.111 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 19:54:12 np0005539279 nova_compute[187514]: 2025-11-29 00:54:12.077 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:14 np0005539279 nova_compute[187514]: 2025-11-29 00:54:14.318 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:17 np0005539279 nova_compute[187514]: 2025-11-29 00:54:17.078 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:18 np0005539279 podman[214230]: 2025-11-29 00:54:18.846280788 +0000 UTC m=+0.081207136 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:54:18 np0005539279 podman[214229]: 2025-11-29 00:54:18.853274974 +0000 UTC m=+0.088160630 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container)
Nov 28 19:54:19 np0005539279 nova_compute[187514]: 2025-11-29 00:54:19.355 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:19 np0005539279 nova_compute[187514]: 2025-11-29 00:54:19.605 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:54:19 np0005539279 nova_compute[187514]: 2025-11-29 00:54:19.636 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:54:19 np0005539279 nova_compute[187514]: 2025-11-29 00:54:19.636 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:54:19 np0005539279 nova_compute[187514]: 2025-11-29 00:54:19.637 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.614 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.616 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.655 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.656 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.656 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.657 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.807 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:54:20 np0005539279 podman[214275]: 2025-11-29 00:54:20.849068476 +0000 UTC m=+0.101713440 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 19:54:20 np0005539279 podman[214274]: 2025-11-29 00:54:20.887442738 +0000 UTC m=+0.137367862 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.902 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.903 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:54:20 np0005539279 nova_compute[187514]: 2025-11-29 00:54:20.985 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.221 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.224 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5592MB free_disk=73.31453323364258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.224 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.225 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.325 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Instance b684198c-70de-4847-95da-9b3d77da7dbb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.326 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.326 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.381 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.430 187518 ERROR nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [req-36929042-eced-4ab2-8058-697298fd1e64] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 15673c9a-eee0-47b4-b3d3-728a0fedb147.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-36929042-eced-4ab2-8058-697298fd1e64"}]}#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.454 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing inventories for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.477 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating ProviderTree inventory for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.478 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.516 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing aggregate associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.549 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing trait associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, traits: COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,HW_CPU_X86_SSE,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.611 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.680 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updated inventory for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.681 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.681 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.710 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 19:54:21 np0005539279 nova_compute[187514]: 2025-11-29 00:54:21.711 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:54:22 np0005539279 nova_compute[187514]: 2025-11-29 00:54:22.081 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 19:54:22 np0005539279 nova_compute[187514]: 2025-11-29 00:54:22.703 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 19:54:22 np0005539279 nova_compute[187514]: 2025-11-29 00:54:22.704 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 19:54:22 np0005539279 nova_compute[187514]: 2025-11-29 00:54:22.704 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 19:54:22 np0005539279 nova_compute[187514]: 2025-11-29 00:54:22.705 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.270 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.271 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquired lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.271 187518 DEBUG nova.network.neutron [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.271 187518 DEBUG nova.objects.instance [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b684198c-70de-4847-95da-9b3d77da7dbb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.652 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.653 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.674 187518 DEBUG nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.779 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.780 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.789 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.790 187518 INFO nova.compute.claims [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Claim successful on node compute-0.ctlplane.example.com
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.942 187518 DEBUG nova.compute.provider_tree [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 19:54:23 np0005539279 nova_compute[187514]: 2025-11-29 00:54:23.964 187518 DEBUG nova.scheduler.client.report [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.004 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.006 187518 DEBUG nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.081 187518 DEBUG nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.083 187518 DEBUG nova.network.neutron [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.107 187518 INFO nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.135 187518 DEBUG nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.241 187518 DEBUG nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.243 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.243 187518 INFO nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Creating image(s)
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.244 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.245 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.246 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.270 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.304 187518 DEBUG nova.policy [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.355 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.357 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.358 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.382 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.403 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.464 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.465 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.502 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.504 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.505 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.591 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.592 187518 DEBUG nova.virt.disk.api [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.593 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.649 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.650 187518 DEBUG nova.virt.disk.api [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.651 187518 DEBUG nova.objects.instance [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid eb1fea54-15cd-4b1e-b337-a003cebd10a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.668 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.669 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Ensure instance console log exists: /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.670 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.670 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:54:24 np0005539279 nova_compute[187514]: 2025-11-29 00:54:24.671 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:54:25 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:25.541 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 19:54:25 np0005539279 nova_compute[187514]: 2025-11-29 00:54:25.543 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 19:54:25 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:25.543 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 19:54:25 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:25.545 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 19:54:25 np0005539279 nova_compute[187514]: 2025-11-29 00:54:25.595 187518 DEBUG nova.network.neutron [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updating instance_info_cache with network_info: [{"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 19:54:25 np0005539279 nova_compute[187514]: 2025-11-29 00:54:25.620 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Releasing lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 19:54:25 np0005539279 nova_compute[187514]: 2025-11-29 00:54:25.621 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 19:54:25 np0005539279 nova_compute[187514]: 2025-11-29 00:54:25.622 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 19:54:25 np0005539279 nova_compute[187514]: 2025-11-29 00:54:25.623 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 19:54:25 np0005539279 nova_compute[187514]: 2025-11-29 00:54:25.624 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 19:54:25 np0005539279 nova_compute[187514]: 2025-11-29 00:54:25.644 187518 DEBUG nova.network.neutron [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Successfully created port: cc367edf-e84b-4282-920c-6ef77203ac87 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 19:54:26 np0005539279 nova_compute[187514]: 2025-11-29 00:54:26.854 187518 DEBUG nova.network.neutron [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Successfully updated port: cc367edf-e84b-4282-920c-6ef77203ac87 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 19:54:26 np0005539279 nova_compute[187514]: 2025-11-29 00:54:26.880 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-eb1fea54-15cd-4b1e-b337-a003cebd10a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:54:26 np0005539279 nova_compute[187514]: 2025-11-29 00:54:26.880 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-eb1fea54-15cd-4b1e-b337-a003cebd10a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:54:26 np0005539279 nova_compute[187514]: 2025-11-29 00:54:26.881 187518 DEBUG nova.network.neutron [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 19:54:26 np0005539279 nova_compute[187514]: 2025-11-29 00:54:26.966 187518 DEBUG nova.compute.manager [req-8adf2231-8de0-4d61-a377-20f6501121b4 req-c0531f3c-f0b2-4ef4-ba80-8d42eb3cca42 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Received event network-changed-cc367edf-e84b-4282-920c-6ef77203ac87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:54:26 np0005539279 nova_compute[187514]: 2025-11-29 00:54:26.966 187518 DEBUG nova.compute.manager [req-8adf2231-8de0-4d61-a377-20f6501121b4 req-c0531f3c-f0b2-4ef4-ba80-8d42eb3cca42 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Refreshing instance network info cache due to event network-changed-cc367edf-e84b-4282-920c-6ef77203ac87. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:54:26 np0005539279 nova_compute[187514]: 2025-11-29 00:54:26.967 187518 DEBUG oslo_concurrency.lockutils [req-8adf2231-8de0-4d61-a377-20f6501121b4 req-c0531f3c-f0b2-4ef4-ba80-8d42eb3cca42 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-eb1fea54-15cd-4b1e-b337-a003cebd10a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:54:27 np0005539279 nova_compute[187514]: 2025-11-29 00:54:27.083 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:27 np0005539279 nova_compute[187514]: 2025-11-29 00:54:27.257 187518 DEBUG nova.network.neutron [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.292 187518 DEBUG nova.network.neutron [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Updating instance_info_cache with network_info: [{"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.320 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-eb1fea54-15cd-4b1e-b337-a003cebd10a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.320 187518 DEBUG nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Instance network_info: |[{"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.321 187518 DEBUG oslo_concurrency.lockutils [req-8adf2231-8de0-4d61-a377-20f6501121b4 req-c0531f3c-f0b2-4ef4-ba80-8d42eb3cca42 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-eb1fea54-15cd-4b1e-b337-a003cebd10a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.322 187518 DEBUG nova.network.neutron [req-8adf2231-8de0-4d61-a377-20f6501121b4 req-c0531f3c-f0b2-4ef4-ba80-8d42eb3cca42 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Refreshing network info cache for port cc367edf-e84b-4282-920c-6ef77203ac87 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.327 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Start _get_guest_xml network_info=[{"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.335 187518 WARNING nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.341 187518 DEBUG nova.virt.libvirt.host [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.342 187518 DEBUG nova.virt.libvirt.host [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.346 187518 DEBUG nova.virt.libvirt.host [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.347 187518 DEBUG nova.virt.libvirt.host [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.348 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.348 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.349 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.350 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.350 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.350 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.351 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.351 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.352 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.352 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.352 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.353 187518 DEBUG nova.virt.hardware [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.359 187518 DEBUG nova.virt.libvirt.vif [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:54:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-254322316',display_name='tempest-TestNetworkBasicOps-server-254322316',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-254322316',id=2,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBdva0q0s5ceda9a59M3GtwSIX0mJW7ry7y/WBh0RHasahghYL++H1QKD6ae6SxJlCZal9NqntPDy/ljOW143Zo/oyIADcOYf1BKZ9oTF+d4DUTLtujKQm/qmDGp6Td5qw==',key_name='tempest-TestNetworkBasicOps-787867870',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-1hzc0edf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:54:24Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=eb1fea54-15cd-4b1e-b337-a003cebd10a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.359 187518 DEBUG nova.network.os_vif_util [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.360 187518 DEBUG nova.network.os_vif_util [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:c8,bridge_name='br-int',has_traffic_filtering=True,id=cc367edf-e84b-4282-920c-6ef77203ac87,network=Network(925da1c0-dca3-4483-abed-7d991383f88a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc367edf-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.362 187518 DEBUG nova.objects.instance [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid eb1fea54-15cd-4b1e-b337-a003cebd10a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.381 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] End _get_guest_xml xml=<domain type="kvm">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <uuid>eb1fea54-15cd-4b1e-b337-a003cebd10a9</uuid>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <name>instance-00000002</name>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-254322316</nova:name>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 00:54:28</nova:creationTime>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:        <nova:port uuid="cc367edf-e84b-4282-920c-6ef77203ac87">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <system>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <entry name="serial">eb1fea54-15cd-4b1e-b337-a003cebd10a9</entry>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <entry name="uuid">eb1fea54-15cd-4b1e-b337-a003cebd10a9</entry>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    </system>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <os>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  </clock>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk.config"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:52:92:c8"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <target dev="tapcc367edf-e8"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/console.log" append="off"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    </serial>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <video>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 19:54:28 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 19:54:28 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:54:28 np0005539279 nova_compute[187514]: </domain>
Nov 28 19:54:28 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.383 187518 DEBUG nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Preparing to wait for external event network-vif-plugged-cc367edf-e84b-4282-920c-6ef77203ac87 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.384 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.384 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.385 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.386 187518 DEBUG nova.virt.libvirt.vif [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:54:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-254322316',display_name='tempest-TestNetworkBasicOps-server-254322316',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-254322316',id=2,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBdva0q0s5ceda9a59M3GtwSIX0mJW7ry7y/WBh0RHasahghYL++H1QKD6ae6SxJlCZal9NqntPDy/ljOW143Zo/oyIADcOYf1BKZ9oTF+d4DUTLtujKQm/qmDGp6Td5qw==',key_name='tempest-TestNetworkBasicOps-787867870',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-1hzc0edf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:54:24Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=eb1fea54-15cd-4b1e-b337-a003cebd10a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.386 187518 DEBUG nova.network.os_vif_util [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.388 187518 DEBUG nova.network.os_vif_util [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:c8,bridge_name='br-int',has_traffic_filtering=True,id=cc367edf-e84b-4282-920c-6ef77203ac87,network=Network(925da1c0-dca3-4483-abed-7d991383f88a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc367edf-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.388 187518 DEBUG os_vif [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:c8,bridge_name='br-int',has_traffic_filtering=True,id=cc367edf-e84b-4282-920c-6ef77203ac87,network=Network(925da1c0-dca3-4483-abed-7d991383f88a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc367edf-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.389 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.390 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.391 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.396 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.396 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc367edf-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.397 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcc367edf-e8, col_values=(('external_ids', {'iface-id': 'cc367edf-e84b-4282-920c-6ef77203ac87', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:92:c8', 'vm-uuid': 'eb1fea54-15cd-4b1e-b337-a003cebd10a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.400 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:28 np0005539279 NetworkManager[55703]: <info>  [1764377668.4015] manager: (tapcc367edf-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.404 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.410 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.412 187518 INFO os_vif [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:c8,bridge_name='br-int',has_traffic_filtering=True,id=cc367edf-e84b-4282-920c-6ef77203ac87,network=Network(925da1c0-dca3-4483-abed-7d991383f88a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc367edf-e8')#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.481 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.481 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.481 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:52:92:c8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.482 187518 INFO nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Using config drive#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.801 187518 INFO nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Creating config drive at /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk.config#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.812 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplvqzsh7m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:54:28 np0005539279 nova_compute[187514]: 2025-11-29 00:54:28.951 187518 DEBUG oslo_concurrency.processutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplvqzsh7m" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:54:29 np0005539279 kernel: tapcc367edf-e8: entered promiscuous mode
Nov 28 19:54:29 np0005539279 NetworkManager[55703]: <info>  [1764377669.0653] manager: (tapcc367edf-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Nov 28 19:54:29 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:29Z|00034|binding|INFO|Claiming lport cc367edf-e84b-4282-920c-6ef77203ac87 for this chassis.
Nov 28 19:54:29 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:29Z|00035|binding|INFO|cc367edf-e84b-4282-920c-6ef77203ac87: Claiming fa:16:3e:52:92:c8 10.100.0.24
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.066 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.076 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:92:c8 10.100.0.24'], port_security=['fa:16:3e:52:92:c8 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'eb1fea54-15cd-4b1e-b337-a003cebd10a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-925da1c0-dca3-4483-abed-7d991383f88a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4760079f-c153-4c1c-8ecc-1f5684c79bde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92c45eae-62b1-4f3e-9274-94124b9489e2, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=cc367edf-e84b-4282-920c-6ef77203ac87) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.078 104584 INFO neutron.agent.ovn.metadata.agent [-] Port cc367edf-e84b-4282-920c-6ef77203ac87 in datapath 925da1c0-dca3-4483-abed-7d991383f88a bound to our chassis#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.080 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 925da1c0-dca3-4483-abed-7d991383f88a#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.093 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[036bca2f-0ef4-4c03-93c4-e1538ba12588]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.094 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap925da1c0-d1 in ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.096 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap925da1c0-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.096 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c00b2bfb-6d58-4dfe-9555-ac47f052486f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.098 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[8cd1f8a7-1544-4011-a509-d8af3eb1b8b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 systemd-udevd[214363]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:54:29 np0005539279 systemd-machined[153752]: New machine qemu-2-instance-00000002.
Nov 28 19:54:29 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:29Z|00036|binding|INFO|Setting lport cc367edf-e84b-4282-920c-6ef77203ac87 ovn-installed in OVS
Nov 28 19:54:29 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:29Z|00037|binding|INFO|Setting lport cc367edf-e84b-4282-920c-6ef77203ac87 up in Southbound
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.110 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:29 np0005539279 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Nov 28 19:54:29 np0005539279 NetworkManager[55703]: <info>  [1764377669.1259] device (tapcc367edf-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:54:29 np0005539279 NetworkManager[55703]: <info>  [1764377669.1272] device (tapcc367edf-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.127 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[a5276283-70be-453f-b243-b9056a07358f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.155 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[a246573a-5735-4690-8b8e-45545ddb7426]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.185 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[b82067b4-917d-461e-839a-06aa9cb58ea3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.191 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[53a767de-db33-4855-aa90-840a9357f731]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 systemd-udevd[214367]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:54:29 np0005539279 NetworkManager[55703]: <info>  [1764377669.1934] manager: (tap925da1c0-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.232 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[44e0f3ff-bf52-4b93-87e4-a0ea16b20280]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.237 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[6daa49e1-36e4-4d15-9615-2fbae91a3919]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 NetworkManager[55703]: <info>  [1764377669.2689] device (tap925da1c0-d0): carrier: link connected
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.281 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9c91f1-0d5d-447e-bdc6-47273de9dbba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.306 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[3d1e6311-0325-4095-a1c9-443d36bafa51]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap925da1c0-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:e4:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360297, 'reachable_time': 18604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214397, 'error': None, 'target': 'ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.327 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[26d6458d-846a-42d1-802b-151228cf8a74]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:e4e6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360297, 'tstamp': 360297}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214402, 'error': None, 'target': 'ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.349 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e95e8019-52d2-4170-b274-f0b7ba8f0745]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap925da1c0-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:e4:e6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360297, 'reachable_time': 18604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214404, 'error': None, 'target': 'ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.383 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8a89fd-9d89-43e0-a0a1-a75eb107fda8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.412 187518 DEBUG nova.compute.manager [req-5cbcda69-48b5-4528-857d-de058fb896a0 req-94d1a236-62a8-487f-99a9-d8518c1698e6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Received event network-vif-plugged-cc367edf-e84b-4282-920c-6ef77203ac87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.413 187518 DEBUG oslo_concurrency.lockutils [req-5cbcda69-48b5-4528-857d-de058fb896a0 req-94d1a236-62a8-487f-99a9-d8518c1698e6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.414 187518 DEBUG oslo_concurrency.lockutils [req-5cbcda69-48b5-4528-857d-de058fb896a0 req-94d1a236-62a8-487f-99a9-d8518c1698e6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.414 187518 DEBUG oslo_concurrency.lockutils [req-5cbcda69-48b5-4528-857d-de058fb896a0 req-94d1a236-62a8-487f-99a9-d8518c1698e6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.414 187518 DEBUG nova.compute.manager [req-5cbcda69-48b5-4528-857d-de058fb896a0 req-94d1a236-62a8-487f-99a9-d8518c1698e6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Processing event network-vif-plugged-cc367edf-e84b-4282-920c-6ef77203ac87 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.418 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377669.4182966, eb1fea54-15cd-4b1e-b337-a003cebd10a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.418 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] VM Started (Lifecycle Event)#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.421 187518 DEBUG nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.425 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.431 187518 INFO nova.virt.libvirt.driver [-] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Instance spawned successfully.#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.432 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.436 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.439 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.449 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.449 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.449 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.450 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.450 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.450 187518 DEBUG nova.virt.libvirt.driver [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.455 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.455 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377669.4211195, eb1fea54-15cd-4b1e-b337-a003cebd10a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.455 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] VM Paused (Lifecycle Event)#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.462 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[7300327d-a724-4f28-8aff-331ad673e67a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.464 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap925da1c0-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.464 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.465 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap925da1c0-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:29 np0005539279 kernel: tap925da1c0-d0: entered promiscuous mode
Nov 28 19:54:29 np0005539279 NetworkManager[55703]: <info>  [1764377669.4680] manager: (tap925da1c0-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.471 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap925da1c0-d0, col_values=(('external_ids', {'iface-id': '47dcbe9f-9647-4e06-8d7b-0038e51775b4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:29 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:29Z|00038|binding|INFO|Releasing lport 47dcbe9f-9647-4e06-8d7b-0038e51775b4 from this chassis (sb_readonly=0)
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.475 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/925da1c0-dca3-4483-abed-7d991383f88a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/925da1c0-dca3-4483-abed-7d991383f88a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.477 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2879681a-4beb-47db-8d08-aefc7f1a307e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.480 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-925da1c0-dca3-4483-abed-7d991383f88a
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/925da1c0-dca3-4483-abed-7d991383f88a.pid.haproxy
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID 925da1c0-dca3-4483-abed-7d991383f88a
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 19:54:29 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:29.481 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a', 'env', 'PROCESS_TAG=haproxy-925da1c0-dca3-4483-abed-7d991383f88a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/925da1c0-dca3-4483-abed-7d991383f88a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.482 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.503 187518 INFO nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Took 5.26 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.504 187518 DEBUG nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.515 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.519 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377669.4252274, eb1fea54-15cd-4b1e-b337-a003cebd10a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.520 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] VM Resumed (Lifecycle Event)#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.546 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.550 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.568 187518 INFO nova.compute.manager [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Took 5.82 seconds to build instance.#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.600 187518 DEBUG oslo_concurrency.lockutils [None req-6818a07f-8792-4ad4-bb70-687ae2d444e5 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.664 187518 DEBUG nova.network.neutron [req-8adf2231-8de0-4d61-a377-20f6501121b4 req-c0531f3c-f0b2-4ef4-ba80-8d42eb3cca42 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Updated VIF entry in instance network info cache for port cc367edf-e84b-4282-920c-6ef77203ac87. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.665 187518 DEBUG nova.network.neutron [req-8adf2231-8de0-4d61-a377-20f6501121b4 req-c0531f3c-f0b2-4ef4-ba80-8d42eb3cca42 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Updating instance_info_cache with network_info: [{"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:54:29 np0005539279 nova_compute[187514]: 2025-11-29 00:54:29.682 187518 DEBUG oslo_concurrency.lockutils [req-8adf2231-8de0-4d61-a377-20f6501121b4 req-c0531f3c-f0b2-4ef4-ba80-8d42eb3cca42 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-eb1fea54-15cd-4b1e-b337-a003cebd10a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:54:29 np0005539279 podman[214437]: 2025-11-29 00:54:29.964769134 +0000 UTC m=+0.102930067 container create 58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 19:54:29 np0005539279 podman[214437]: 2025-11-29 00:54:29.905217088 +0000 UTC m=+0.043378061 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 19:54:30 np0005539279 systemd[1]: Started libpod-conmon-58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631.scope.
Nov 28 19:54:30 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:54:30 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/616552baf2a84cdcc977c25e4b70e57bf20ac7fad1528893e75e8580141af502/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 19:54:30 np0005539279 podman[214437]: 2025-11-29 00:54:30.08706371 +0000 UTC m=+0.225224673 container init 58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 19:54:30 np0005539279 podman[214437]: 2025-11-29 00:54:30.098685653 +0000 UTC m=+0.236846546 container start 58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 19:54:30 np0005539279 neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a[214452]: [NOTICE]   (214457) : New worker (214459) forked
Nov 28 19:54:30 np0005539279 neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a[214452]: [NOTICE]   (214457) : Loading success.
Nov 28 19:54:31 np0005539279 nova_compute[187514]: 2025-11-29 00:54:31.534 187518 DEBUG nova.compute.manager [req-3cf8fd59-152d-4537-b585-d4e33293be13 req-4ecbf183-3575-423c-aeb2-f7b3694f4d52 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Received event network-vif-plugged-cc367edf-e84b-4282-920c-6ef77203ac87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:54:31 np0005539279 nova_compute[187514]: 2025-11-29 00:54:31.534 187518 DEBUG oslo_concurrency.lockutils [req-3cf8fd59-152d-4537-b585-d4e33293be13 req-4ecbf183-3575-423c-aeb2-f7b3694f4d52 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:31 np0005539279 nova_compute[187514]: 2025-11-29 00:54:31.535 187518 DEBUG oslo_concurrency.lockutils [req-3cf8fd59-152d-4537-b585-d4e33293be13 req-4ecbf183-3575-423c-aeb2-f7b3694f4d52 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:31 np0005539279 nova_compute[187514]: 2025-11-29 00:54:31.535 187518 DEBUG oslo_concurrency.lockutils [req-3cf8fd59-152d-4537-b585-d4e33293be13 req-4ecbf183-3575-423c-aeb2-f7b3694f4d52 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:31 np0005539279 nova_compute[187514]: 2025-11-29 00:54:31.535 187518 DEBUG nova.compute.manager [req-3cf8fd59-152d-4537-b585-d4e33293be13 req-4ecbf183-3575-423c-aeb2-f7b3694f4d52 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] No waiting events found dispatching network-vif-plugged-cc367edf-e84b-4282-920c-6ef77203ac87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:54:31 np0005539279 nova_compute[187514]: 2025-11-29 00:54:31.535 187518 WARNING nova.compute.manager [req-3cf8fd59-152d-4537-b585-d4e33293be13 req-4ecbf183-3575-423c-aeb2-f7b3694f4d52 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Received unexpected event network-vif-plugged-cc367edf-e84b-4282-920c-6ef77203ac87 for instance with vm_state active and task_state None.#033[00m
Nov 28 19:54:32 np0005539279 nova_compute[187514]: 2025-11-29 00:54:32.088 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:33 np0005539279 nova_compute[187514]: 2025-11-29 00:54:33.400 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:35 np0005539279 podman[214470]: 2025-11-29 00:54:35.862227461 +0000 UTC m=+0.089503731 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 19:54:35 np0005539279 podman[214471]: 2025-11-29 00:54:35.868799184 +0000 UTC m=+0.093005923 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 19:54:37 np0005539279 nova_compute[187514]: 2025-11-29 00:54:37.090 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:38 np0005539279 podman[214512]: 2025-11-29 00:54:38.127147359 +0000 UTC m=+0.089219442 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 19:54:38 np0005539279 nova_compute[187514]: 2025-11-29 00:54:38.403 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:40 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:40Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:92:c8 10.100.0.24
Nov 28 19:54:40 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:40Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:92:c8 10.100.0.24
Nov 28 19:54:42 np0005539279 nova_compute[187514]: 2025-11-29 00:54:42.093 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:43 np0005539279 nova_compute[187514]: 2025-11-29 00:54:43.405 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:47 np0005539279 nova_compute[187514]: 2025-11-29 00:54:47.096 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:48 np0005539279 nova_compute[187514]: 2025-11-29 00:54:48.406 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:49 np0005539279 podman[214545]: 2025-11-29 00:54:49.846458672 +0000 UTC m=+0.079305861 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 19:54:49 np0005539279 podman[214544]: 2025-11-29 00:54:49.871640897 +0000 UTC m=+0.106879805 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350)
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.603 187518 DEBUG oslo_concurrency.lockutils [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.605 187518 DEBUG oslo_concurrency.lockutils [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.605 187518 DEBUG oslo_concurrency.lockutils [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.605 187518 DEBUG oslo_concurrency.lockutils [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.606 187518 DEBUG oslo_concurrency.lockutils [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.607 187518 INFO nova.compute.manager [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Terminating instance#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.609 187518 DEBUG nova.compute.manager [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 19:54:51 np0005539279 kernel: tapcc367edf-e8 (unregistering): left promiscuous mode
Nov 28 19:54:51 np0005539279 NetworkManager[55703]: <info>  [1764377691.6357] device (tapcc367edf-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.683 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:51 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:51Z|00039|binding|INFO|Releasing lport cc367edf-e84b-4282-920c-6ef77203ac87 from this chassis (sb_readonly=0)
Nov 28 19:54:51 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:51Z|00040|binding|INFO|Setting lport cc367edf-e84b-4282-920c-6ef77203ac87 down in Southbound
Nov 28 19:54:51 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:51Z|00041|binding|INFO|Removing iface tapcc367edf-e8 ovn-installed in OVS
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.690 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:51.697 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:92:c8 10.100.0.24'], port_security=['fa:16:3e:52:92:c8 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'eb1fea54-15cd-4b1e-b337-a003cebd10a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-925da1c0-dca3-4483-abed-7d991383f88a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4760079f-c153-4c1c-8ecc-1f5684c79bde', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=92c45eae-62b1-4f3e-9274-94124b9489e2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=cc367edf-e84b-4282-920c-6ef77203ac87) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:54:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:51.700 104584 INFO neutron.agent.ovn.metadata.agent [-] Port cc367edf-e84b-4282-920c-6ef77203ac87 in datapath 925da1c0-dca3-4483-abed-7d991383f88a unbound from our chassis#033[00m
Nov 28 19:54:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:51.702 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 925da1c0-dca3-4483-abed-7d991383f88a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 19:54:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:51.704 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b5deb0c8-d861-4144-af59-09105eda689c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:51.705 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a namespace which is not needed anymore#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.712 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:51 np0005539279 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 28 19:54:51 np0005539279 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 11.757s CPU time.
Nov 28 19:54:51 np0005539279 systemd-machined[153752]: Machine qemu-2-instance-00000002 terminated.
Nov 28 19:54:51 np0005539279 podman[214591]: 2025-11-29 00:54:51.811422908 +0000 UTC m=+0.099663406 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 19:54:51 np0005539279 kernel: tapcc367edf-e8: entered promiscuous mode
Nov 28 19:54:51 np0005539279 kernel: tapcc367edf-e8 (unregistering): left promiscuous mode
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.848 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:51 np0005539279 podman[214588]: 2025-11-29 00:54:51.869070009 +0000 UTC m=+0.162931454 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.900 187518 INFO nova.virt.libvirt.driver [-] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Instance destroyed successfully.#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.900 187518 DEBUG nova.objects.instance [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid eb1fea54-15cd-4b1e-b337-a003cebd10a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:54:51 np0005539279 neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a[214452]: [NOTICE]   (214457) : haproxy version is 2.8.14-c23fe91
Nov 28 19:54:51 np0005539279 neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a[214452]: [NOTICE]   (214457) : path to executable is /usr/sbin/haproxy
Nov 28 19:54:51 np0005539279 neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a[214452]: [WARNING]  (214457) : Exiting Master process...
Nov 28 19:54:51 np0005539279 neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a[214452]: [ALERT]    (214457) : Current worker (214459) exited with code 143 (Terminated)
Nov 28 19:54:51 np0005539279 neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a[214452]: [WARNING]  (214457) : All workers exited. Exiting... (0)
Nov 28 19:54:51 np0005539279 systemd[1]: libpod-58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631.scope: Deactivated successfully.
Nov 28 19:54:51 np0005539279 podman[214651]: 2025-11-29 00:54:51.91917442 +0000 UTC m=+0.064150848 container died 58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.921 187518 DEBUG nova.virt.libvirt.vif [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:54:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-254322316',display_name='tempest-TestNetworkBasicOps-server-254322316',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-254322316',id=2,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBdva0q0s5ceda9a59M3GtwSIX0mJW7ry7y/WBh0RHasahghYL++H1QKD6ae6SxJlCZal9NqntPDy/ljOW143Zo/oyIADcOYf1BKZ9oTF+d4DUTLtujKQm/qmDGp6Td5qw==',key_name='tempest-TestNetworkBasicOps-787867870',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:54:29Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-1hzc0edf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:54:29Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=eb1fea54-15cd-4b1e-b337-a003cebd10a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.922 187518 DEBUG nova.network.os_vif_util [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "cc367edf-e84b-4282-920c-6ef77203ac87", "address": "fa:16:3e:52:92:c8", "network": {"id": "925da1c0-dca3-4483-abed-7d991383f88a", "bridge": "br-int", "label": "tempest-network-smoke--543309114", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc367edf-e8", "ovs_interfaceid": "cc367edf-e84b-4282-920c-6ef77203ac87", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.923 187518 DEBUG nova.network.os_vif_util [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:c8,bridge_name='br-int',has_traffic_filtering=True,id=cc367edf-e84b-4282-920c-6ef77203ac87,network=Network(925da1c0-dca3-4483-abed-7d991383f88a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc367edf-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.923 187518 DEBUG os_vif [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:c8,bridge_name='br-int',has_traffic_filtering=True,id=cc367edf-e84b-4282-920c-6ef77203ac87,network=Network(925da1c0-dca3-4483-abed-7d991383f88a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc367edf-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.925 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.925 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc367edf-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.927 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.929 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.933 187518 INFO os_vif [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:c8,bridge_name='br-int',has_traffic_filtering=True,id=cc367edf-e84b-4282-920c-6ef77203ac87,network=Network(925da1c0-dca3-4483-abed-7d991383f88a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc367edf-e8')#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.934 187518 INFO nova.virt.libvirt.driver [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Deleting instance files /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9_del#033[00m
Nov 28 19:54:51 np0005539279 nova_compute[187514]: 2025-11-29 00:54:51.935 187518 INFO nova.virt.libvirt.driver [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Deletion of /var/lib/nova/instances/eb1fea54-15cd-4b1e-b337-a003cebd10a9_del complete#033[00m
Nov 28 19:54:51 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631-userdata-shm.mount: Deactivated successfully.
Nov 28 19:54:51 np0005539279 systemd[1]: var-lib-containers-storage-overlay-616552baf2a84cdcc977c25e4b70e57bf20ac7fad1528893e75e8580141af502-merged.mount: Deactivated successfully.
Nov 28 19:54:51 np0005539279 podman[214651]: 2025-11-29 00:54:51.970226413 +0000 UTC m=+0.115202831 container cleanup 58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:54:51 np0005539279 systemd[1]: libpod-conmon-58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631.scope: Deactivated successfully.
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.021 187518 DEBUG nova.virt.libvirt.host [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.022 187518 INFO nova.virt.libvirt.host [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] UEFI support detected#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.025 187518 INFO nova.compute.manager [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.026 187518 DEBUG oslo.service.loopingcall [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.026 187518 DEBUG nova.compute.manager [-] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.027 187518 DEBUG nova.network.neutron [-] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 19:54:52 np0005539279 podman[214699]: 2025-11-29 00:54:52.074612115 +0000 UTC m=+0.069226047 container remove 58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 19:54:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:52.083 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[decd5513-641c-4f29-9a96-4681a2e13fbf]: (4, ('Sat Nov 29 12:54:51 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a (58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631)\n58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631\nSat Nov 29 12:54:51 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a (58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631)\n58db28c664617358488a4a554c12173bff59430c8ab2bf61e57077a4288dd631\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:52.085 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[6f4802a9-ee19-4b74-b10d-df440f4aa869]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:52.087 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap925da1c0-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.089 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:52 np0005539279 kernel: tap925da1c0-d0: left promiscuous mode
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.114 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:52.120 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[03279bfe-a117-4c26-a933-f237812ba61c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:52.135 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[4460c207-c5b8-4f50-8733-6340e3d7ca7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:52.137 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[23195cba-1d86-4b8e-a1be-7a22912447c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:52.161 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[767d6190-1fa4-4209-9334-1aa0f994f9ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360288, 'reachable_time': 24815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214713, 'error': None, 'target': 'ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:52 np0005539279 systemd[1]: run-netns-ovnmeta\x2d925da1c0\x2ddca3\x2d4483\x2dabed\x2d7d991383f88a.mount: Deactivated successfully.
Nov 28 19:54:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:52.178 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-925da1c0-dca3-4483-abed-7d991383f88a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 19:54:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:52.179 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[247a1c12-15af-49ef-a4f5-692c9c80d461]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.291 187518 DEBUG nova.compute.manager [req-9373b467-9d87-4554-a557-4996bb7e448f req-432c8d7a-39e9-4a5d-b71e-95e1e492c8bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Received event network-vif-unplugged-cc367edf-e84b-4282-920c-6ef77203ac87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.292 187518 DEBUG oslo_concurrency.lockutils [req-9373b467-9d87-4554-a557-4996bb7e448f req-432c8d7a-39e9-4a5d-b71e-95e1e492c8bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.292 187518 DEBUG oslo_concurrency.lockutils [req-9373b467-9d87-4554-a557-4996bb7e448f req-432c8d7a-39e9-4a5d-b71e-95e1e492c8bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.293 187518 DEBUG oslo_concurrency.lockutils [req-9373b467-9d87-4554-a557-4996bb7e448f req-432c8d7a-39e9-4a5d-b71e-95e1e492c8bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.293 187518 DEBUG nova.compute.manager [req-9373b467-9d87-4554-a557-4996bb7e448f req-432c8d7a-39e9-4a5d-b71e-95e1e492c8bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] No waiting events found dispatching network-vif-unplugged-cc367edf-e84b-4282-920c-6ef77203ac87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.293 187518 DEBUG nova.compute.manager [req-9373b467-9d87-4554-a557-4996bb7e448f req-432c8d7a-39e9-4a5d-b71e-95e1e492c8bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Received event network-vif-unplugged-cc367edf-e84b-4282-920c-6ef77203ac87 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.638 187518 DEBUG nova.network.neutron [-] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.664 187518 INFO nova.compute.manager [-] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Took 0.64 seconds to deallocate network for instance.#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.729 187518 DEBUG oslo_concurrency.lockutils [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.729 187518 DEBUG oslo_concurrency.lockutils [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.825 187518 DEBUG nova.compute.provider_tree [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.841 187518 DEBUG nova.scheduler.client.report [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.866 187518 DEBUG oslo_concurrency.lockutils [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.891 187518 INFO nova.scheduler.client.report [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance eb1fea54-15cd-4b1e-b337-a003cebd10a9#033[00m
Nov 28 19:54:52 np0005539279 nova_compute[187514]: 2025-11-29 00:54:52.973 187518 DEBUG oslo_concurrency.lockutils [None req-cd5d9c90-a487-4ab2-8878-6d834ca7fc20 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.368s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:53 np0005539279 nova_compute[187514]: 2025-11-29 00:54:53.081 187518 DEBUG nova.compute.manager [req-ed4245a6-6413-4a3a-902a-ddd519f2d249 req-1d35749b-1a91-4754-b8b3-8652dc2f9bfc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Received event network-vif-deleted-cc367edf-e84b-4282-920c-6ef77203ac87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:54:54 np0005539279 nova_compute[187514]: 2025-11-29 00:54:54.402 187518 DEBUG nova.compute.manager [req-3c376070-e283-4ceb-8d9b-3238ff7b359d req-a585e7ed-bb92-41e8-b5f0-43ab2200272e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Received event network-vif-plugged-cc367edf-e84b-4282-920c-6ef77203ac87 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:54:54 np0005539279 nova_compute[187514]: 2025-11-29 00:54:54.403 187518 DEBUG oslo_concurrency.lockutils [req-3c376070-e283-4ceb-8d9b-3238ff7b359d req-a585e7ed-bb92-41e8-b5f0-43ab2200272e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:54 np0005539279 nova_compute[187514]: 2025-11-29 00:54:54.403 187518 DEBUG oslo_concurrency.lockutils [req-3c376070-e283-4ceb-8d9b-3238ff7b359d req-a585e7ed-bb92-41e8-b5f0-43ab2200272e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:54 np0005539279 nova_compute[187514]: 2025-11-29 00:54:54.404 187518 DEBUG oslo_concurrency.lockutils [req-3c376070-e283-4ceb-8d9b-3238ff7b359d req-a585e7ed-bb92-41e8-b5f0-43ab2200272e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "eb1fea54-15cd-4b1e-b337-a003cebd10a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:54 np0005539279 nova_compute[187514]: 2025-11-29 00:54:54.404 187518 DEBUG nova.compute.manager [req-3c376070-e283-4ceb-8d9b-3238ff7b359d req-a585e7ed-bb92-41e8-b5f0-43ab2200272e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] No waiting events found dispatching network-vif-plugged-cc367edf-e84b-4282-920c-6ef77203ac87 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:54:54 np0005539279 nova_compute[187514]: 2025-11-29 00:54:54.405 187518 WARNING nova.compute.manager [req-3c376070-e283-4ceb-8d9b-3238ff7b359d req-a585e7ed-bb92-41e8-b5f0-43ab2200272e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Received unexpected event network-vif-plugged-cc367edf-e84b-4282-920c-6ef77203ac87 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 19:54:56 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:56Z|00042|binding|INFO|Releasing lport f9056d3b-5257-40f1-ae67-c7beec09428a from this chassis (sb_readonly=0)
Nov 28 19:54:56 np0005539279 nova_compute[187514]: 2025-11-29 00:54:56.739 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:56 np0005539279 nova_compute[187514]: 2025-11-29 00:54:56.927 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.121 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.787 187518 DEBUG nova.compute.manager [req-f3e43558-a7e8-4000-82f3-5e58615479a5 req-b89e3b05-9d23-47c6-a9df-bf82b253ff97 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received event network-changed-dc0d6f5b-4063-4940-975e-10b9379eb880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.788 187518 DEBUG nova.compute.manager [req-f3e43558-a7e8-4000-82f3-5e58615479a5 req-b89e3b05-9d23-47c6-a9df-bf82b253ff97 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Refreshing instance network info cache due to event network-changed-dc0d6f5b-4063-4940-975e-10b9379eb880. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.788 187518 DEBUG oslo_concurrency.lockutils [req-f3e43558-a7e8-4000-82f3-5e58615479a5 req-b89e3b05-9d23-47c6-a9df-bf82b253ff97 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.789 187518 DEBUG oslo_concurrency.lockutils [req-f3e43558-a7e8-4000-82f3-5e58615479a5 req-b89e3b05-9d23-47c6-a9df-bf82b253ff97 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.789 187518 DEBUG nova.network.neutron [req-f3e43558-a7e8-4000-82f3-5e58615479a5 req-b89e3b05-9d23-47c6-a9df-bf82b253ff97 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Refreshing network info cache for port dc0d6f5b-4063-4940-975e-10b9379eb880 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.901 187518 DEBUG oslo_concurrency.lockutils [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "b684198c-70de-4847-95da-9b3d77da7dbb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.903 187518 DEBUG oslo_concurrency.lockutils [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.903 187518 DEBUG oslo_concurrency.lockutils [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.904 187518 DEBUG oslo_concurrency.lockutils [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.904 187518 DEBUG oslo_concurrency.lockutils [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.907 187518 INFO nova.compute.manager [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Terminating instance#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.909 187518 DEBUG nova.compute.manager [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 19:54:57 np0005539279 kernel: tapdc0d6f5b-40 (unregistering): left promiscuous mode
Nov 28 19:54:57 np0005539279 NetworkManager[55703]: <info>  [1764377697.9371] device (tapdc0d6f5b-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 19:54:57 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:57Z|00043|binding|INFO|Releasing lport dc0d6f5b-4063-4940-975e-10b9379eb880 from this chassis (sb_readonly=0)
Nov 28 19:54:57 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:57Z|00044|binding|INFO|Setting lport dc0d6f5b-4063-4940-975e-10b9379eb880 down in Southbound
Nov 28 19:54:57 np0005539279 ovn_controller[95686]: 2025-11-29T00:54:57Z|00045|binding|INFO|Removing iface tapdc0d6f5b-40 ovn-installed in OVS
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.945 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:57.964 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:1a:67 10.100.0.9'], port_security=['fa:16:3e:5b:1a:67 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b684198c-70de-4847-95da-9b3d77da7dbb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4c74f40e-8f35-48f0-bee4-57a35c0924f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ddd9cfd4-609a-4c6d-8c17-5dc3fb4f927f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=127f73d2-2151-4a8e-987c-7618da8ab21d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=dc0d6f5b-4063-4940-975e-10b9379eb880) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:54:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:57.965 104584 INFO neutron.agent.ovn.metadata.agent [-] Port dc0d6f5b-4063-4940-975e-10b9379eb880 in datapath 4c74f40e-8f35-48f0-bee4-57a35c0924f2 unbound from our chassis#033[00m
Nov 28 19:54:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:57.966 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4c74f40e-8f35-48f0-bee4-57a35c0924f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 19:54:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:57.967 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[757c5dfe-eb5c-4cf9-9141-66cb619c319e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:57.968 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2 namespace which is not needed anymore#033[00m
Nov 28 19:54:57 np0005539279 nova_compute[187514]: 2025-11-29 00:54:57.968 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:58 np0005539279 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 28 19:54:58 np0005539279 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 14.767s CPU time.
Nov 28 19:54:58 np0005539279 systemd-machined[153752]: Machine qemu-1-instance-00000001 terminated.
Nov 28 19:54:58 np0005539279 neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2[214124]: [NOTICE]   (214128) : haproxy version is 2.8.14-c23fe91
Nov 28 19:54:58 np0005539279 neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2[214124]: [NOTICE]   (214128) : path to executable is /usr/sbin/haproxy
Nov 28 19:54:58 np0005539279 neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2[214124]: [WARNING]  (214128) : Exiting Master process...
Nov 28 19:54:58 np0005539279 neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2[214124]: [WARNING]  (214128) : Exiting Master process...
Nov 28 19:54:58 np0005539279 neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2[214124]: [ALERT]    (214128) : Current worker (214130) exited with code 143 (Terminated)
Nov 28 19:54:58 np0005539279 neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2[214124]: [WARNING]  (214128) : All workers exited. Exiting... (0)
Nov 28 19:54:58 np0005539279 systemd[1]: libpod-f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9.scope: Deactivated successfully.
Nov 28 19:54:58 np0005539279 podman[214740]: 2025-11-29 00:54:58.120613853 +0000 UTC m=+0.060054032 container died f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.139 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.145 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:58 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9-userdata-shm.mount: Deactivated successfully.
Nov 28 19:54:58 np0005539279 systemd[1]: var-lib-containers-storage-overlay-3fde8772c750be15c64c8fd727b2cb41f7bd0e1dbdb03209ec2485b98eaabb82-merged.mount: Deactivated successfully.
Nov 28 19:54:58 np0005539279 podman[214740]: 2025-11-29 00:54:58.173323531 +0000 UTC m=+0.112763720 container cleanup f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.195 187518 INFO nova.virt.libvirt.driver [-] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Instance destroyed successfully.#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.196 187518 DEBUG nova.objects.instance [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid b684198c-70de-4847-95da-9b3d77da7dbb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:54:58 np0005539279 systemd[1]: libpod-conmon-f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9.scope: Deactivated successfully.
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.213 187518 DEBUG nova.virt.libvirt.vif [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:53:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1039049178',display_name='tempest-TestNetworkBasicOps-server-1039049178',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1039049178',id=1,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHctOnxtO1cZjnywvziAgspEU7SiXv/37/3xfOAey/+qXIzu7yeRWuxik3GnzwZDqAYudEb2ozpm4Jl84nvxbVOaAVyNgscfEkyUwG86RbJ/uw52uW9+STd2w/CiuqFJAQ==',key_name='tempest-TestNetworkBasicOps-860229746',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:53:53Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-6s9ppta1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:53:53Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=b684198c-70de-4847-95da-9b3d77da7dbb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.214 187518 DEBUG nova.network.os_vif_util [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.215 187518 DEBUG nova.network.os_vif_util [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:1a:67,bridge_name='br-int',has_traffic_filtering=True,id=dc0d6f5b-4063-4940-975e-10b9379eb880,network=Network(4c74f40e-8f35-48f0-bee4-57a35c0924f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc0d6f5b-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.215 187518 DEBUG os_vif [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:1a:67,bridge_name='br-int',has_traffic_filtering=True,id=dc0d6f5b-4063-4940-975e-10b9379eb880,network=Network(4c74f40e-8f35-48f0-bee4-57a35c0924f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc0d6f5b-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 19:54:58 np0005539279 podman[214786]: 2025-11-29 00:54:58.254096689 +0000 UTC m=+0.039539842 container remove f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 19:54:58 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:58.260 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2fb45d46-02bd-4b35-a19e-e82fd10acf85]: (4, ('Sat Nov 29 12:54:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2 (f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9)\nf801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9\nSat Nov 29 12:54:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2 (f801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9)\nf801a2bab8071cb3de84818644b0edfc61645832f56f560cdb59db52dcc197d9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:58 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:58.262 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[85400eb3-c7ce-4e79-981f-ef628dc5ba77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:58 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:58.264 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c74f40e-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:58 np0005539279 kernel: tap4c74f40e-80: left promiscuous mode
Nov 28 19:54:58 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:58.295 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b49519a4-bac7-4a6b-ba9d-88df4eeb3d9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:58 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:58.309 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[4cdde3f1-284c-468c-bc1d-58f179df6758]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:58 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:58.312 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0b585eed-12c3-42bc-a638-a6d196e7d648]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:58 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:58.332 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c92c9084-297c-432f-8e60-6aad73fd4d83]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 356969, 'reachable_time': 42552, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214805, 'error': None, 'target': 'ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:58 np0005539279 systemd[1]: run-netns-ovnmeta\x2d4c74f40e\x2d8f35\x2d48f0\x2dbee4\x2d57a35c0924f2.mount: Deactivated successfully.
Nov 28 19:54:58 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:58.338 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4c74f40e-8f35-48f0-bee4-57a35c0924f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 19:54:58 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:54:58.338 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[a65d0aab-a1f7-4959-a6bb-19e9c82916ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.341 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.342 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdc0d6f5b-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.342 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.343 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.345 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.347 187518 INFO os_vif [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:1a:67,bridge_name='br-int',has_traffic_filtering=True,id=dc0d6f5b-4063-4940-975e-10b9379eb880,network=Network(4c74f40e-8f35-48f0-bee4-57a35c0924f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdc0d6f5b-40')#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.347 187518 INFO nova.virt.libvirt.driver [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Deleting instance files /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb_del#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.348 187518 INFO nova.virt.libvirt.driver [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Deletion of /var/lib/nova/instances/b684198c-70de-4847-95da-9b3d77da7dbb_del complete#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.425 187518 INFO nova.compute.manager [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Took 0.52 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.426 187518 DEBUG oslo.service.loopingcall [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.426 187518 DEBUG nova.compute.manager [-] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 19:54:58 np0005539279 nova_compute[187514]: 2025-11-29 00:54:58.426 187518 DEBUG nova.network.neutron [-] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.484 187518 DEBUG nova.network.neutron [req-f3e43558-a7e8-4000-82f3-5e58615479a5 req-b89e3b05-9d23-47c6-a9df-bf82b253ff97 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updated VIF entry in instance network info cache for port dc0d6f5b-4063-4940-975e-10b9379eb880. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.485 187518 DEBUG nova.network.neutron [req-f3e43558-a7e8-4000-82f3-5e58615479a5 req-b89e3b05-9d23-47c6-a9df-bf82b253ff97 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updating instance_info_cache with network_info: [{"id": "dc0d6f5b-4063-4940-975e-10b9379eb880", "address": "fa:16:3e:5b:1a:67", "network": {"id": "4c74f40e-8f35-48f0-bee4-57a35c0924f2", "bridge": "br-int", "label": "tempest-network-smoke--1455309404", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdc0d6f5b-40", "ovs_interfaceid": "dc0d6f5b-4063-4940-975e-10b9379eb880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.507 187518 DEBUG nova.network.neutron [-] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.552 187518 INFO nova.compute.manager [-] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Took 1.13 seconds to deallocate network for instance.#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.564 187518 DEBUG oslo_concurrency.lockutils [req-f3e43558-a7e8-4000-82f3-5e58615479a5 req-b89e3b05-9d23-47c6-a9df-bf82b253ff97 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-b684198c-70de-4847-95da-9b3d77da7dbb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.603 187518 DEBUG oslo_concurrency.lockutils [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.604 187518 DEBUG oslo_concurrency.lockutils [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.659 187518 DEBUG nova.compute.provider_tree [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.680 187518 DEBUG nova.scheduler.client.report [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.709 187518 DEBUG oslo_concurrency.lockutils [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.730 187518 INFO nova.scheduler.client.report [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance b684198c-70de-4847-95da-9b3d77da7dbb#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.827 187518 DEBUG oslo_concurrency.lockutils [None req-6275d6e0-1319-44ae-9daa-86d93acc8631 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.894 187518 DEBUG nova.compute.manager [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received event network-vif-unplugged-dc0d6f5b-4063-4940-975e-10b9379eb880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.895 187518 DEBUG oslo_concurrency.lockutils [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.895 187518 DEBUG oslo_concurrency.lockutils [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.896 187518 DEBUG oslo_concurrency.lockutils [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.896 187518 DEBUG nova.compute.manager [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] No waiting events found dispatching network-vif-unplugged-dc0d6f5b-4063-4940-975e-10b9379eb880 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.897 187518 WARNING nova.compute.manager [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received unexpected event network-vif-unplugged-dc0d6f5b-4063-4940-975e-10b9379eb880 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.897 187518 DEBUG nova.compute.manager [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received event network-vif-plugged-dc0d6f5b-4063-4940-975e-10b9379eb880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.898 187518 DEBUG oslo_concurrency.lockutils [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.898 187518 DEBUG oslo_concurrency.lockutils [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.899 187518 DEBUG oslo_concurrency.lockutils [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "b684198c-70de-4847-95da-9b3d77da7dbb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.899 187518 DEBUG nova.compute.manager [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] No waiting events found dispatching network-vif-plugged-dc0d6f5b-4063-4940-975e-10b9379eb880 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.899 187518 WARNING nova.compute.manager [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received unexpected event network-vif-plugged-dc0d6f5b-4063-4940-975e-10b9379eb880 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 19:54:59 np0005539279 nova_compute[187514]: 2025-11-29 00:54:59.900 187518 DEBUG nova.compute.manager [req-a4cc6617-28fb-479d-9ebe-c063053dd535 req-739c09c5-f953-4c34-8d40-473d5bf84923 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Received event network-vif-deleted-dc0d6f5b-4063-4940-975e-10b9379eb880 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:55:02 np0005539279 nova_compute[187514]: 2025-11-29 00:55:02.123 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:03 np0005539279 nova_compute[187514]: 2025-11-29 00:55:03.343 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:03 np0005539279 nova_compute[187514]: 2025-11-29 00:55:03.697 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:03 np0005539279 nova_compute[187514]: 2025-11-29 00:55:03.802 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:06 np0005539279 podman[214813]: 2025-11-29 00:55:06.871717597 +0000 UTC m=+0.099748609 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:55:06 np0005539279 podman[214814]: 2025-11-29 00:55:06.8872157 +0000 UTC m=+0.112114968 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:55:06 np0005539279 nova_compute[187514]: 2025-11-29 00:55:06.897 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764377691.896641, eb1fea54-15cd-4b1e-b337-a003cebd10a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:55:06 np0005539279 nova_compute[187514]: 2025-11-29 00:55:06.898 187518 INFO nova.compute.manager [-] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] VM Stopped (Lifecycle Event)#033[00m
Nov 28 19:55:06 np0005539279 nova_compute[187514]: 2025-11-29 00:55:06.954 187518 DEBUG nova.compute.manager [None req-834c5be9-2ad4-4601-8291-d18f6ac6bdbc - - - - - -] [instance: eb1fea54-15cd-4b1e-b337-a003cebd10a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:55:07 np0005539279 nova_compute[187514]: 2025-11-29 00:55:07.126 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:08.087 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:08.088 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:08.089 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:08 np0005539279 nova_compute[187514]: 2025-11-29 00:55:08.345 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:08 np0005539279 podman[214857]: 2025-11-29 00:55:08.849365243 +0000 UTC m=+0.087696299 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 28 19:55:12 np0005539279 nova_compute[187514]: 2025-11-29 00:55:12.128 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:13 np0005539279 nova_compute[187514]: 2025-11-29 00:55:13.194 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764377698.1915445, b684198c-70de-4847-95da-9b3d77da7dbb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:55:13 np0005539279 nova_compute[187514]: 2025-11-29 00:55:13.194 187518 INFO nova.compute.manager [-] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] VM Stopped (Lifecycle Event)#033[00m
Nov 28 19:55:13 np0005539279 nova_compute[187514]: 2025-11-29 00:55:13.243 187518 DEBUG nova.compute.manager [None req-22d9bd37-4c91-4710-a5d4-23a8b432bfc2 - - - - - -] [instance: b684198c-70de-4847-95da-9b3d77da7dbb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:55:13 np0005539279 nova_compute[187514]: 2025-11-29 00:55:13.347 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:17 np0005539279 nova_compute[187514]: 2025-11-29 00:55:17.131 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:18 np0005539279 nova_compute[187514]: 2025-11-29 00:55:18.349 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:19 np0005539279 nova_compute[187514]: 2025-11-29 00:55:19.615 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:55:19 np0005539279 nova_compute[187514]: 2025-11-29 00:55:19.616 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:55:19 np0005539279 nova_compute[187514]: 2025-11-29 00:55:19.617 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:55:20 np0005539279 nova_compute[187514]: 2025-11-29 00:55:20.615 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:55:20 np0005539279 podman[214883]: 2025-11-29 00:55:20.848059375 +0000 UTC m=+0.086067315 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Nov 28 19:55:20 np0005539279 podman[214884]: 2025-11-29 00:55:20.863756215 +0000 UTC m=+0.097912067 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.133 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.636 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.637 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.688 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.689 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.689 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.690 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:55:22 np0005539279 podman[214931]: 2025-11-29 00:55:22.861728436 +0000 UTC m=+0.107354330 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 19:55:22 np0005539279 podman[214930]: 2025-11-29 00:55:22.901734003 +0000 UTC m=+0.154586607 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.952 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.954 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5791MB free_disk=73.34353256225586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.954 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:22 np0005539279 nova_compute[187514]: 2025-11-29 00:55:22.954 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:23 np0005539279 nova_compute[187514]: 2025-11-29 00:55:23.039 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:55:23 np0005539279 nova_compute[187514]: 2025-11-29 00:55:23.040 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:55:23 np0005539279 nova_compute[187514]: 2025-11-29 00:55:23.068 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:55:23 np0005539279 nova_compute[187514]: 2025-11-29 00:55:23.086 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:55:23 np0005539279 nova_compute[187514]: 2025-11-29 00:55:23.123 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:55:23 np0005539279 nova_compute[187514]: 2025-11-29 00:55:23.124 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:23 np0005539279 nova_compute[187514]: 2025-11-29 00:55:23.351 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:24 np0005539279 nova_compute[187514]: 2025-11-29 00:55:24.096 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:55:24 np0005539279 nova_compute[187514]: 2025-11-29 00:55:24.097 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:55:24 np0005539279 nova_compute[187514]: 2025-11-29 00:55:24.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:55:24 np0005539279 nova_compute[187514]: 2025-11-29 00:55:24.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:55:25 np0005539279 nova_compute[187514]: 2025-11-29 00:55:25.704 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "9c7d413a-3895-4195-aef1-97e4607b8046" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:25 np0005539279 nova_compute[187514]: 2025-11-29 00:55:25.705 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "9c7d413a-3895-4195-aef1-97e4607b8046" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:25 np0005539279 nova_compute[187514]: 2025-11-29 00:55:25.730 187518 DEBUG nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 19:55:25 np0005539279 nova_compute[187514]: 2025-11-29 00:55:25.832 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:25 np0005539279 nova_compute[187514]: 2025-11-29 00:55:25.833 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:25 np0005539279 nova_compute[187514]: 2025-11-29 00:55:25.844 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 19:55:25 np0005539279 nova_compute[187514]: 2025-11-29 00:55:25.845 187518 INFO nova.compute.claims [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 19:55:25 np0005539279 nova_compute[187514]: 2025-11-29 00:55:25.978 187518 DEBUG nova.compute.provider_tree [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.003 187518 DEBUG nova.scheduler.client.report [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.035 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.036 187518 DEBUG nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.137 187518 DEBUG nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.137 187518 DEBUG nova.network.neutron [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.162 187518 INFO nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.211 187518 DEBUG nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.335 187518 DEBUG nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.337 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.338 187518 INFO nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Creating image(s)#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.339 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.339 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.340 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.356 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.451 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.452 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.453 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.468 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.560 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.561 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.607 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.609 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.609 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.698 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.700 187518 DEBUG nova.virt.disk.api [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.700 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.787 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.788 187518 DEBUG nova.virt.disk.api [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.789 187518 DEBUG nova.objects.instance [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid 9c7d413a-3895-4195-aef1-97e4607b8046 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.812 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.813 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Ensure instance console log exists: /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.814 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.814 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:26 np0005539279 nova_compute[187514]: 2025-11-29 00:55:26.815 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:27 np0005539279 nova_compute[187514]: 2025-11-29 00:55:27.135 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:27 np0005539279 nova_compute[187514]: 2025-11-29 00:55:27.267 187518 DEBUG nova.policy [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 19:55:28 np0005539279 nova_compute[187514]: 2025-11-29 00:55:28.352 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:29 np0005539279 nova_compute[187514]: 2025-11-29 00:55:29.129 187518 DEBUG nova.network.neutron [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Successfully created port: d42bba49-9f7c-4a7a-b1a4-003f83b3b098 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 19:55:30 np0005539279 nova_compute[187514]: 2025-11-29 00:55:30.640 187518 DEBUG nova.network.neutron [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Successfully updated port: d42bba49-9f7c-4a7a-b1a4-003f83b3b098 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 19:55:30 np0005539279 nova_compute[187514]: 2025-11-29 00:55:30.669 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:55:30 np0005539279 nova_compute[187514]: 2025-11-29 00:55:30.669 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:55:30 np0005539279 nova_compute[187514]: 2025-11-29 00:55:30.670 187518 DEBUG nova.network.neutron [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 19:55:30 np0005539279 nova_compute[187514]: 2025-11-29 00:55:30.784 187518 DEBUG nova.compute.manager [req-2a800b7c-7c71-4ed1-ad67-dc7892a9d054 req-2095e7dd-65dd-4603-b919-0f7da32071d8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Received event network-changed-d42bba49-9f7c-4a7a-b1a4-003f83b3b098 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:55:30 np0005539279 nova_compute[187514]: 2025-11-29 00:55:30.785 187518 DEBUG nova.compute.manager [req-2a800b7c-7c71-4ed1-ad67-dc7892a9d054 req-2095e7dd-65dd-4603-b919-0f7da32071d8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Refreshing instance network info cache due to event network-changed-d42bba49-9f7c-4a7a-b1a4-003f83b3b098. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:55:30 np0005539279 nova_compute[187514]: 2025-11-29 00:55:30.785 187518 DEBUG oslo_concurrency.lockutils [req-2a800b7c-7c71-4ed1-ad67-dc7892a9d054 req-2095e7dd-65dd-4603-b919-0f7da32071d8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:55:31 np0005539279 nova_compute[187514]: 2025-11-29 00:55:31.285 187518 DEBUG nova.network.neutron [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.161 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.234 187518 DEBUG nova.network.neutron [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Updating instance_info_cache with network_info: [{"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.260 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.260 187518 DEBUG nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Instance network_info: |[{"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.261 187518 DEBUG oslo_concurrency.lockutils [req-2a800b7c-7c71-4ed1-ad67-dc7892a9d054 req-2095e7dd-65dd-4603-b919-0f7da32071d8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.261 187518 DEBUG nova.network.neutron [req-2a800b7c-7c71-4ed1-ad67-dc7892a9d054 req-2095e7dd-65dd-4603-b919-0f7da32071d8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Refreshing network info cache for port d42bba49-9f7c-4a7a-b1a4-003f83b3b098 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.266 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Start _get_guest_xml network_info=[{"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.273 187518 WARNING nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.280 187518 DEBUG nova.virt.libvirt.host [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.281 187518 DEBUG nova.virt.libvirt.host [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.291 187518 DEBUG nova.virt.libvirt.host [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.291 187518 DEBUG nova.virt.libvirt.host [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.292 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.293 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.293 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.294 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.294 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.295 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.295 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.296 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.296 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.297 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.297 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.298 187518 DEBUG nova.virt.hardware [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.304 187518 DEBUG nova.virt.libvirt.vif [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:55:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744856111',display_name='tempest-TestNetworkBasicOps-server-1744856111',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744856111',id=3,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAXuaxJxjf8H26ARN7YBrYS08zZ+yQNDawKLwF7THYFd4EiTUlJWrDIpSFpWoQh7y68ms8CYgpxHpnj+zm1gLdRd7Hzj803Ppy+udMAz+74E4qDaifY56L98SohD29Bwkw==',key_name='tempest-TestNetworkBasicOps-1739039869',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-krrll58u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:55:26Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=9c7d413a-3895-4195-aef1-97e4607b8046,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.304 187518 DEBUG nova.network.os_vif_util [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.306 187518 DEBUG nova.network.os_vif_util [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:6e:9a,bridge_name='br-int',has_traffic_filtering=True,id=d42bba49-9f7c-4a7a-b1a4-003f83b3b098,network=Network(65f87900-2702-4980-b308-8d80b8ae4722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd42bba49-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.307 187518 DEBUG nova.objects.instance [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9c7d413a-3895-4195-aef1-97e4607b8046 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:55:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.332 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] End _get_guest_xml xml=<domain type="kvm">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <uuid>9c7d413a-3895-4195-aef1-97e4607b8046</uuid>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <name>instance-00000003</name>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-1744856111</nova:name>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 00:55:32</nova:creationTime>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:        <nova:port uuid="d42bba49-9f7c-4a7a-b1a4-003f83b3b098">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <system>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <entry name="serial">9c7d413a-3895-4195-aef1-97e4607b8046</entry>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <entry name="uuid">9c7d413a-3895-4195-aef1-97e4607b8046</entry>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    </system>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <os>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  </clock>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk.config"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:d5:6e:9a"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <target dev="tapd42bba49-9f"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/console.log" append="off"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    </serial>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <video>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 19:55:32 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 19:55:32 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:55:32 np0005539279 nova_compute[187514]: </domain>
Nov 28 19:55:32 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.335 187518 DEBUG nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Preparing to wait for external event network-vif-plugged-d42bba49-9f7c-4a7a-b1a4-003f83b3b098 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.335 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "9c7d413a-3895-4195-aef1-97e4607b8046-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.336 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "9c7d413a-3895-4195-aef1-97e4607b8046-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.336 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "9c7d413a-3895-4195-aef1-97e4607b8046-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.337 187518 DEBUG nova.virt.libvirt.vif [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:55:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744856111',display_name='tempest-TestNetworkBasicOps-server-1744856111',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744856111',id=3,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAXuaxJxjf8H26ARN7YBrYS08zZ+yQNDawKLwF7THYFd4EiTUlJWrDIpSFpWoQh7y68ms8CYgpxHpnj+zm1gLdRd7Hzj803Ppy+udMAz+74E4qDaifY56L98SohD29Bwkw==',key_name='tempest-TestNetworkBasicOps-1739039869',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-krrll58u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:55:26Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=9c7d413a-3895-4195-aef1-97e4607b8046,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.338 187518 DEBUG nova.network.os_vif_util [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.339 187518 DEBUG nova.network.os_vif_util [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:6e:9a,bridge_name='br-int',has_traffic_filtering=True,id=d42bba49-9f7c-4a7a-b1a4-003f83b3b098,network=Network(65f87900-2702-4980-b308-8d80b8ae4722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd42bba49-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.339 187518 DEBUG os_vif [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:6e:9a,bridge_name='br-int',has_traffic_filtering=True,id=d42bba49-9f7c-4a7a-b1a4-003f83b3b098,network=Network(65f87900-2702-4980-b308-8d80b8ae4722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd42bba49-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.340 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.341 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.341 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.346 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.346 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd42bba49-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.347 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd42bba49-9f, col_values=(('external_ids', {'iface-id': 'd42bba49-9f7c-4a7a-b1a4-003f83b3b098', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:6e:9a', 'vm-uuid': '9c7d413a-3895-4195-aef1-97e4607b8046'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.349 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:32 np0005539279 NetworkManager[55703]: <info>  [1764377732.3514] manager: (tapd42bba49-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.373 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.375 187518 INFO os_vif [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:6e:9a,bridge_name='br-int',has_traffic_filtering=True,id=d42bba49-9f7c-4a7a-b1a4-003f83b3b098,network=Network(65f87900-2702-4980-b308-8d80b8ae4722),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd42bba49-9f')#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.465 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.466 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.466 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:d5:6e:9a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.467 187518 INFO nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Using config drive#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.779 187518 INFO nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Creating config drive at /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk.config#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.786 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz7nbq653 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:55:32 np0005539279 nova_compute[187514]: 2025-11-29 00:55:32.928 187518 DEBUG oslo_concurrency.processutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9c7d413a-3895-4195-aef1-97e4607b8046/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz7nbq653" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:55:33 np0005539279 kernel: tapd42bba49-9f: entered promiscuous mode
Nov 28 19:55:33 np0005539279 NetworkManager[55703]: <info>  [1764377733.0196] manager: (tapd42bba49-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Nov 28 19:55:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:55:33Z|00046|binding|INFO|Claiming lport d42bba49-9f7c-4a7a-b1a4-003f83b3b098 for this chassis.
Nov 28 19:55:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:55:33Z|00047|binding|INFO|d42bba49-9f7c-4a7a-b1a4-003f83b3b098: Claiming fa:16:3e:d5:6e:9a 10.100.0.8
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.021 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.027 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.032 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.045 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:6e:9a 10.100.0.8'], port_security=['fa:16:3e:d5:6e:9a 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '9c7d413a-3895-4195-aef1-97e4607b8046', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-65f87900-2702-4980-b308-8d80b8ae4722', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e2738c58-a4a2-40b8-92ec-f3ad302dbb30', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24c19f34-5566-47de-89d9-56e6f1b38d31, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=d42bba49-9f7c-4a7a-b1a4-003f83b3b098) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.047 104584 INFO neutron.agent.ovn.metadata.agent [-] Port d42bba49-9f7c-4a7a-b1a4-003f83b3b098 in datapath 65f87900-2702-4980-b308-8d80b8ae4722 bound to our chassis#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.048 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 65f87900-2702-4980-b308-8d80b8ae4722#033[00m
Nov 28 19:55:33 np0005539279 systemd-udevd[215010]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.066 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0882f1d4-bbe2-4086-a91a-1810cbd0edee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.068 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap65f87900-21 in ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.070 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap65f87900-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.070 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[d7060971-8114-4899-803e-726720c5f3cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.071 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[9f2d2f30-6d01-4224-8c43-d524feace69b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 systemd-machined[153752]: New machine qemu-3-instance-00000003.
Nov 28 19:55:33 np0005539279 NetworkManager[55703]: <info>  [1764377733.0848] device (tapd42bba49-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:55:33 np0005539279 NetworkManager[55703]: <info>  [1764377733.0867] device (tapd42bba49-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.090 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6176e0-67c3-483e-b3da-6fceae659145]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.119 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:55:33Z|00048|binding|INFO|Setting lport d42bba49-9f7c-4a7a-b1a4-003f83b3b098 ovn-installed in OVS
Nov 28 19:55:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:55:33Z|00049|binding|INFO|Setting lport d42bba49-9f7c-4a7a-b1a4-003f83b3b098 up in Southbound
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.124 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.129 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[8834f063-573f-40e6-ae38-55c4bdd67a03]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.166 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[820e11f4-6238-4da7-878e-8fa5e40ce1a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 NetworkManager[55703]: <info>  [1764377733.1757] manager: (tap65f87900-20): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.174 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[4da08428-47b7-4545-b8b3-9b7b944239ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.235 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[15b2fb38-7e18-4665-881d-e250cc9227c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.239 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bfb9b5-848a-42fa-a2e2-0c21863cde63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 NetworkManager[55703]: <info>  [1764377733.2804] device (tap65f87900-20): carrier: link connected
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.285 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[ffcb9e1e-6c9e-4b6f-9443-64a4e24d3ae9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.311 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[a5177e18-9a2b-4de5-8ebb-d695e70fff34]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65f87900-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:4e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366698, 'reachable_time': 18190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215043, 'error': None, 'target': 'ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.342 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[4d278bc4-7aa1-4c8a-8cb1-2ff63c6e4a15]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:4e23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 366698, 'tstamp': 366698}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215045, 'error': None, 'target': 'ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.371 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a758b3-ccdd-4ed3-8f51-2fc991837a9b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap65f87900-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:4e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 366698, 'reachable_time': 18190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215048, 'error': None, 'target': 'ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.416 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ec7abb-5d81-42e4-9ebe-816a457fa322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.431 187518 DEBUG nova.compute.manager [req-a44fe417-106c-4ebf-9a64-d28a03c5b44c req-8a1c352f-2255-4229-9a04-5feb386010b8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Received event network-vif-plugged-d42bba49-9f7c-4a7a-b1a4-003f83b3b098 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.432 187518 DEBUG oslo_concurrency.lockutils [req-a44fe417-106c-4ebf-9a64-d28a03c5b44c req-8a1c352f-2255-4229-9a04-5feb386010b8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "9c7d413a-3895-4195-aef1-97e4607b8046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.433 187518 DEBUG oslo_concurrency.lockutils [req-a44fe417-106c-4ebf-9a64-d28a03c5b44c req-8a1c352f-2255-4229-9a04-5feb386010b8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "9c7d413a-3895-4195-aef1-97e4607b8046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.433 187518 DEBUG oslo_concurrency.lockutils [req-a44fe417-106c-4ebf-9a64-d28a03c5b44c req-8a1c352f-2255-4229-9a04-5feb386010b8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "9c7d413a-3895-4195-aef1-97e4607b8046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.434 187518 DEBUG nova.compute.manager [req-a44fe417-106c-4ebf-9a64-d28a03c5b44c req-8a1c352f-2255-4229-9a04-5feb386010b8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Processing event network-vif-plugged-d42bba49-9f7c-4a7a-b1a4-003f83b3b098 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.457 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377733.4568105, 9c7d413a-3895-4195-aef1-97e4607b8046 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.458 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] VM Started (Lifecycle Event)#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.466 187518 DEBUG nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.477 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.485 187518 INFO nova.virt.libvirt.driver [-] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Instance spawned successfully.#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.487 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.491 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.496 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.506 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0e379a54-3a70-4ce8-ac34-7ac2a12f963c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.508 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap65f87900-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.509 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.509 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap65f87900-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:55:33 np0005539279 NetworkManager[55703]: <info>  [1764377733.5133] manager: (tap65f87900-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.512 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:33 np0005539279 kernel: tap65f87900-20: entered promiscuous mode
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.518 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap65f87900-20, col_values=(('external_ids', {'iface-id': 'dfcc5378-8987-42ff-ae91-870f70e1287e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.519 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:55:33Z|00050|binding|INFO|Releasing lport dfcc5378-8987-42ff-ae91-870f70e1287e from this chassis (sb_readonly=0)
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.523 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.524 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/65f87900-2702-4980-b308-8d80b8ae4722.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/65f87900-2702-4980-b308-8d80b8ae4722.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.525 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[efa5e245-f2ac-49db-8635-72d6607258c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.526 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-65f87900-2702-4980-b308-8d80b8ae4722
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/65f87900-2702-4980-b308-8d80b8ae4722.pid.haproxy
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.525 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.526 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377733.4570339, 9c7d413a-3895-4195-aef1-97e4607b8046 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.526 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] VM Paused (Lifecycle Event)#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID 65f87900-2702-4980-b308-8d80b8ae4722
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 19:55:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:33.527 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722', 'env', 'PROCESS_TAG=haproxy-65f87900-2702-4980-b308-8d80b8ae4722', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/65f87900-2702-4980-b308-8d80b8ae4722.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.530 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.530 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.530 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.531 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.532 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.532 187518 DEBUG nova.virt.libvirt.driver [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.548 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.565 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.570 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377733.4706612, 9c7d413a-3895-4195-aef1-97e4607b8046 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.570 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] VM Resumed (Lifecycle Event)#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.601 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.608 187518 INFO nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Took 7.27 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.608 187518 DEBUG nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.609 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.628 187518 DEBUG nova.network.neutron [req-2a800b7c-7c71-4ed1-ad67-dc7892a9d054 req-2095e7dd-65dd-4603-b919-0f7da32071d8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Updated VIF entry in instance network info cache for port d42bba49-9f7c-4a7a-b1a4-003f83b3b098. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.629 187518 DEBUG nova.network.neutron [req-2a800b7c-7c71-4ed1-ad67-dc7892a9d054 req-2095e7dd-65dd-4603-b919-0f7da32071d8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Updating instance_info_cache with network_info: [{"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.665 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.687 187518 DEBUG oslo_concurrency.lockutils [req-2a800b7c-7c71-4ed1-ad67-dc7892a9d054 req-2095e7dd-65dd-4603-b919-0f7da32071d8 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.699 187518 INFO nova.compute.manager [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Took 7.89 seconds to build instance.#033[00m
Nov 28 19:55:33 np0005539279 nova_compute[187514]: 2025-11-29 00:55:33.720 187518 DEBUG oslo_concurrency.lockutils [None req-398757d1-7f28-492b-a109-254905caf42b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "9c7d413a-3895-4195-aef1-97e4607b8046" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.015s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:33 np0005539279 podman[215085]: 2025-11-29 00:55:33.982289018 +0000 UTC m=+0.075426472 container create 743107f4508a5c20f7366343571fd5781566ff304e0320709f7c8cd5eefd58f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:55:34 np0005539279 podman[215085]: 2025-11-29 00:55:33.944063331 +0000 UTC m=+0.037200845 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 19:55:34 np0005539279 systemd[1]: Started libpod-conmon-743107f4508a5c20f7366343571fd5781566ff304e0320709f7c8cd5eefd58f2.scope.
Nov 28 19:55:34 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:55:34 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/017e8abddd55d3ca8fd11d1985af12fe46da6fb0077485f99e2d06fd8f3ef415/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 19:55:34 np0005539279 podman[215085]: 2025-11-29 00:55:34.112215196 +0000 UTC m=+0.205352720 container init 743107f4508a5c20f7366343571fd5781566ff304e0320709f7c8cd5eefd58f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 19:55:34 np0005539279 podman[215085]: 2025-11-29 00:55:34.124022338 +0000 UTC m=+0.217159792 container start 743107f4508a5c20f7366343571fd5781566ff304e0320709f7c8cd5eefd58f2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 19:55:34 np0005539279 neutron-haproxy-ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722[215100]: [NOTICE]   (215104) : New worker (215106) forked
Nov 28 19:55:34 np0005539279 neutron-haproxy-ovnmeta-65f87900-2702-4980-b308-8d80b8ae4722[215100]: [NOTICE]   (215104) : Loading success.
Nov 28 19:55:35 np0005539279 nova_compute[187514]: 2025-11-29 00:55:35.530 187518 DEBUG nova.compute.manager [req-642c064a-efa0-4849-800b-81b3184d0ce6 req-642b79ed-9de4-4f1f-8880-dd1349e8668a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Received event network-vif-plugged-d42bba49-9f7c-4a7a-b1a4-003f83b3b098 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:55:35 np0005539279 nova_compute[187514]: 2025-11-29 00:55:35.531 187518 DEBUG oslo_concurrency.lockutils [req-642c064a-efa0-4849-800b-81b3184d0ce6 req-642b79ed-9de4-4f1f-8880-dd1349e8668a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "9c7d413a-3895-4195-aef1-97e4607b8046-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:35 np0005539279 nova_compute[187514]: 2025-11-29 00:55:35.531 187518 DEBUG oslo_concurrency.lockutils [req-642c064a-efa0-4849-800b-81b3184d0ce6 req-642b79ed-9de4-4f1f-8880-dd1349e8668a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "9c7d413a-3895-4195-aef1-97e4607b8046-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:35 np0005539279 nova_compute[187514]: 2025-11-29 00:55:35.531 187518 DEBUG oslo_concurrency.lockutils [req-642c064a-efa0-4849-800b-81b3184d0ce6 req-642b79ed-9de4-4f1f-8880-dd1349e8668a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "9c7d413a-3895-4195-aef1-97e4607b8046-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:55:35 np0005539279 nova_compute[187514]: 2025-11-29 00:55:35.532 187518 DEBUG nova.compute.manager [req-642c064a-efa0-4849-800b-81b3184d0ce6 req-642b79ed-9de4-4f1f-8880-dd1349e8668a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] No waiting events found dispatching network-vif-plugged-d42bba49-9f7c-4a7a-b1a4-003f83b3b098 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:55:35 np0005539279 nova_compute[187514]: 2025-11-29 00:55:35.532 187518 WARNING nova.compute.manager [req-642c064a-efa0-4849-800b-81b3184d0ce6 req-642b79ed-9de4-4f1f-8880-dd1349e8668a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Received unexpected event network-vif-plugged-d42bba49-9f7c-4a7a-b1a4-003f83b3b098 for instance with vm_state active and task_state None.#033[00m
Nov 28 19:55:37 np0005539279 nova_compute[187514]: 2025-11-29 00:55:37.163 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:37 np0005539279 nova_compute[187514]: 2025-11-29 00:55:37.349 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:37 np0005539279 nova_compute[187514]: 2025-11-29 00:55:37.410 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:37 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:37.412 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:55:37 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:37.414 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 19:55:37 np0005539279 ovn_controller[95686]: 2025-11-29T00:55:37Z|00051|binding|INFO|Releasing lport dfcc5378-8987-42ff-ae91-870f70e1287e from this chassis (sb_readonly=0)
Nov 28 19:55:37 np0005539279 NetworkManager[55703]: <info>  [1764377737.5596] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Nov 28 19:55:37 np0005539279 NetworkManager[55703]: <info>  [1764377737.5613] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 28 19:55:37 np0005539279 nova_compute[187514]: 2025-11-29 00:55:37.581 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:37 np0005539279 ovn_controller[95686]: 2025-11-29T00:55:37Z|00052|binding|INFO|Releasing lport dfcc5378-8987-42ff-ae91-870f70e1287e from this chassis (sb_readonly=0)
Nov 28 19:55:37 np0005539279 nova_compute[187514]: 2025-11-29 00:55:37.600 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:37 np0005539279 nova_compute[187514]: 2025-11-29 00:55:37.607 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:37 np0005539279 podman[215117]: 2025-11-29 00:55:37.858256279 +0000 UTC m=+0.099309453 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Nov 28 19:55:37 np0005539279 podman[215116]: 2025-11-29 00:55:37.865661155 +0000 UTC m=+0.107131323 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:55:38 np0005539279 nova_compute[187514]: 2025-11-29 00:55:38.042 187518 DEBUG nova.compute.manager [req-37059bf8-fd98-419e-b2b0-e13ba5f02cf1 req-e5ba5f30-c604-4e6b-9c43-2db3beb80dac 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Received event network-changed-d42bba49-9f7c-4a7a-b1a4-003f83b3b098 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:55:38 np0005539279 nova_compute[187514]: 2025-11-29 00:55:38.043 187518 DEBUG nova.compute.manager [req-37059bf8-fd98-419e-b2b0-e13ba5f02cf1 req-e5ba5f30-c604-4e6b-9c43-2db3beb80dac 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Refreshing instance network info cache due to event network-changed-d42bba49-9f7c-4a7a-b1a4-003f83b3b098. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:55:38 np0005539279 nova_compute[187514]: 2025-11-29 00:55:38.043 187518 DEBUG oslo_concurrency.lockutils [req-37059bf8-fd98-419e-b2b0-e13ba5f02cf1 req-e5ba5f30-c604-4e6b-9c43-2db3beb80dac 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:55:38 np0005539279 nova_compute[187514]: 2025-11-29 00:55:38.044 187518 DEBUG oslo_concurrency.lockutils [req-37059bf8-fd98-419e-b2b0-e13ba5f02cf1 req-e5ba5f30-c604-4e6b-9c43-2db3beb80dac 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:55:38 np0005539279 nova_compute[187514]: 2025-11-29 00:55:38.044 187518 DEBUG nova.network.neutron [req-37059bf8-fd98-419e-b2b0-e13ba5f02cf1 req-e5ba5f30-c604-4e6b-9c43-2db3beb80dac 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Refreshing network info cache for port d42bba49-9f7c-4a7a-b1a4-003f83b3b098 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:55:39 np0005539279 nova_compute[187514]: 2025-11-29 00:55:39.478 187518 DEBUG nova.network.neutron [req-37059bf8-fd98-419e-b2b0-e13ba5f02cf1 req-e5ba5f30-c604-4e6b-9c43-2db3beb80dac 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Updated VIF entry in instance network info cache for port d42bba49-9f7c-4a7a-b1a4-003f83b3b098. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:55:39 np0005539279 nova_compute[187514]: 2025-11-29 00:55:39.478 187518 DEBUG nova.network.neutron [req-37059bf8-fd98-419e-b2b0-e13ba5f02cf1 req-e5ba5f30-c604-4e6b-9c43-2db3beb80dac 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Updating instance_info_cache with network_info: [{"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:55:39 np0005539279 nova_compute[187514]: 2025-11-29 00:55:39.507 187518 DEBUG oslo_concurrency.lockutils [req-37059bf8-fd98-419e-b2b0-e13ba5f02cf1 req-e5ba5f30-c604-4e6b-9c43-2db3beb80dac 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:55:39 np0005539279 podman[215162]: 2025-11-29 00:55:39.844820811 +0000 UTC m=+0.089811019 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:55:42 np0005539279 nova_compute[187514]: 2025-11-29 00:55:42.168 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:42 np0005539279 nova_compute[187514]: 2025-11-29 00:55:42.353 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:44 np0005539279 ovn_controller[95686]: 2025-11-29T00:55:44Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d5:6e:9a 10.100.0.8
Nov 28 19:55:44 np0005539279 ovn_controller[95686]: 2025-11-29T00:55:44Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d5:6e:9a 10.100.0.8
Nov 28 19:55:44 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:55:44.418 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:55:47 np0005539279 nova_compute[187514]: 2025-11-29 00:55:47.181 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:47 np0005539279 nova_compute[187514]: 2025-11-29 00:55:47.355 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:50 np0005539279 nova_compute[187514]: 2025-11-29 00:55:50.843 187518 INFO nova.compute.manager [None req-c90f1fca-ed0f-4104-b1bc-593a55c9d9a2 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Get console output#033[00m
Nov 28 19:55:50 np0005539279 nova_compute[187514]: 2025-11-29 00:55:50.854 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 19:55:51 np0005539279 podman[215199]: 2025-11-29 00:55:51.267833613 +0000 UTC m=+0.088039650 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:55:51 np0005539279 podman[215198]: 2025-11-29 00:55:51.278838828 +0000 UTC m=+0.102169799 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Nov 28 19:55:52 np0005539279 nova_compute[187514]: 2025-11-29 00:55:52.183 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:52 np0005539279 nova_compute[187514]: 2025-11-29 00:55:52.357 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:53 np0005539279 podman[215244]: 2025-11-29 00:55:53.875286693 +0000 UTC m=+0.088364901 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 19:55:53 np0005539279 podman[215243]: 2025-11-29 00:55:53.904066187 +0000 UTC m=+0.126431003 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller)
Nov 28 19:55:54 np0005539279 nova_compute[187514]: 2025-11-29 00:55:54.030 187518 DEBUG oslo_concurrency.lockutils [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "interface-9c7d413a-3895-4195-aef1-97e4607b8046-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:55:54 np0005539279 nova_compute[187514]: 2025-11-29 00:55:54.031 187518 DEBUG oslo_concurrency.lockutils [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "interface-9c7d413a-3895-4195-aef1-97e4607b8046-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:55:54 np0005539279 nova_compute[187514]: 2025-11-29 00:55:54.031 187518 DEBUG nova.objects.instance [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'flavor' on Instance uuid 9c7d413a-3895-4195-aef1-97e4607b8046 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:55:55 np0005539279 nova_compute[187514]: 2025-11-29 00:55:55.310 187518 DEBUG nova.objects.instance [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_requests' on Instance uuid 9c7d413a-3895-4195-aef1-97e4607b8046 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:55:55 np0005539279 nova_compute[187514]: 2025-11-29 00:55:55.333 187518 DEBUG nova.network.neutron [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 19:55:56 np0005539279 nova_compute[187514]: 2025-11-29 00:55:56.323 187518 DEBUG nova.policy [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 19:55:57 np0005539279 nova_compute[187514]: 2025-11-29 00:55:57.185 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:57 np0005539279 nova_compute[187514]: 2025-11-29 00:55:57.359 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:55:58 np0005539279 nova_compute[187514]: 2025-11-29 00:55:58.523 187518 DEBUG nova.network.neutron [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Successfully created port: afce301e-b341-464f-9f41-fad9a37c454d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 19:55:59 np0005539279 nova_compute[187514]: 2025-11-29 00:55:59.165 187518 DEBUG nova.network.neutron [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Successfully updated port: afce301e-b341-464f-9f41-fad9a37c454d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 19:55:59 np0005539279 nova_compute[187514]: 2025-11-29 00:55:59.181 187518 DEBUG oslo_concurrency.lockutils [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:55:59 np0005539279 nova_compute[187514]: 2025-11-29 00:55:59.182 187518 DEBUG oslo_concurrency.lockutils [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:55:59 np0005539279 nova_compute[187514]: 2025-11-29 00:55:59.182 187518 DEBUG nova.network.neutron [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 19:55:59 np0005539279 nova_compute[187514]: 2025-11-29 00:55:59.281 187518 DEBUG nova.compute.manager [req-6289bcfa-6316-42c6-8d42-d4533c7199f7 req-539c9d95-8a5b-4ed5-96e6-a3e6c0da2e47 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Received event network-changed-afce301e-b341-464f-9f41-fad9a37c454d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:55:59 np0005539279 nova_compute[187514]: 2025-11-29 00:55:59.282 187518 DEBUG nova.compute.manager [req-6289bcfa-6316-42c6-8d42-d4533c7199f7 req-539c9d95-8a5b-4ed5-96e6-a3e6c0da2e47 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Refreshing instance network info cache due to event network-changed-afce301e-b341-464f-9f41-fad9a37c454d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:55:59 np0005539279 nova_compute[187514]: 2025-11-29 00:55:59.282 187518 DEBUG oslo_concurrency.lockutils [req-6289bcfa-6316-42c6-8d42-d4533c7199f7 req-539c9d95-8a5b-4ed5-96e6-a3e6c0da2e47 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.787 187518 DEBUG nova.network.neutron [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Updating instance_info_cache with network_info: [{"id": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "address": "fa:16:3e:d5:6e:9a", "network": {"id": "65f87900-2702-4980-b308-8d80b8ae4722", "bridge": "br-int", "label": "tempest-network-smoke--907951046", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd42bba49-9f", "ovs_interfaceid": "d42bba49-9f7c-4a7a-b1a4-003f83b3b098", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "afce301e-b341-464f-9f41-fad9a37c454d", "address": "fa:16:3e:86:1d:7d", "network": {"id": "9b411d83-42cd-41ac-bdaa-ce305ed6daf9", "bridge": "br-int", "label": "tempest-network-smoke--584081957", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafce301e-b3", "ovs_interfaceid": "afce301e-b341-464f-9f41-fad9a37c454d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.817 187518 DEBUG oslo_concurrency.lockutils [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.818 187518 DEBUG oslo_concurrency.lockutils [req-6289bcfa-6316-42c6-8d42-d4533c7199f7 req-539c9d95-8a5b-4ed5-96e6-a3e6c0da2e47 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-9c7d413a-3895-4195-aef1-97e4607b8046" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.819 187518 DEBUG nova.network.neutron [req-6289bcfa-6316-42c6-8d42-d4533c7199f7 req-539c9d95-8a5b-4ed5-96e6-a3e6c0da2e47 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Refreshing network info cache for port afce301e-b341-464f-9f41-fad9a37c454d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.821 187518 DEBUG nova.virt.libvirt.vif [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:55:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744856111',display_name='tempest-TestNetworkBasicOps-server-1744856111',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744856111',id=3,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAXuaxJxjf8H26ARN7YBrYS08zZ+yQNDawKLwF7THYFd4EiTUlJWrDIpSFpWoQh7y68ms8CYgpxHpnj+zm1gLdRd7Hzj803Ppy+udMAz+74E4qDaifY56L98SohD29Bwkw==',key_name='tempest-TestNetworkBasicOps-1739039869',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:55:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-krrll58u',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:55:33Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=9c7d413a-3895-4195-aef1-97e4607b8046,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afce301e-b341-464f-9f41-fad9a37c454d", "address": "fa:16:3e:86:1d:7d", "network": {"id": "9b411d83-42cd-41ac-bdaa-ce305ed6daf9", "bridge": "br-int", "label": "tempest-network-smoke--584081957", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafce301e-b3", "ovs_interfaceid": "afce301e-b341-464f-9f41-fad9a37c454d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.822 187518 DEBUG nova.network.os_vif_util [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "afce301e-b341-464f-9f41-fad9a37c454d", "address": "fa:16:3e:86:1d:7d", "network": {"id": "9b411d83-42cd-41ac-bdaa-ce305ed6daf9", "bridge": "br-int", "label": "tempest-network-smoke--584081957", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafce301e-b3", "ovs_interfaceid": "afce301e-b341-464f-9f41-fad9a37c454d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.822 187518 DEBUG nova.network.os_vif_util [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:1d:7d,bridge_name='br-int',has_traffic_filtering=True,id=afce301e-b341-464f-9f41-fad9a37c454d,network=Network(9b411d83-42cd-41ac-bdaa-ce305ed6daf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafce301e-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.822 187518 DEBUG os_vif [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:1d:7d,bridge_name='br-int',has_traffic_filtering=True,id=afce301e-b341-464f-9f41-fad9a37c454d,network=Network(9b411d83-42cd-41ac-bdaa-ce305ed6daf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafce301e-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.823 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.823 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.824 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.826 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.827 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafce301e-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.827 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapafce301e-b3, col_values=(('external_ids', {'iface-id': 'afce301e-b341-464f-9f41-fad9a37c454d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:1d:7d', 'vm-uuid': '9c7d413a-3895-4195-aef1-97e4607b8046'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.854 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:01 np0005539279 NetworkManager[55703]: <info>  [1764377761.8568] manager: (tapafce301e-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.858 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.864 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.865 187518 INFO os_vif [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:1d:7d,bridge_name='br-int',has_traffic_filtering=True,id=afce301e-b341-464f-9f41-fad9a37c454d,network=Network(9b411d83-42cd-41ac-bdaa-ce305ed6daf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafce301e-b3')#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.867 187518 DEBUG nova.virt.libvirt.vif [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:55:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744856111',display_name='tempest-TestNetworkBasicOps-server-1744856111',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744856111',id=3,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAXuaxJxjf8H26ARN7YBrYS08zZ+yQNDawKLwF7THYFd4EiTUlJWrDIpSFpWoQh7y68ms8CYgpxHpnj+zm1gLdRd7Hzj803Ppy+udMAz+74E4qDaifY56L98SohD29Bwkw==',key_name='tempest-TestNetworkBasicOps-1739039869',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:55:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-krrll58u',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:55:33Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=9c7d413a-3895-4195-aef1-97e4607b8046,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "afce301e-b341-464f-9f41-fad9a37c454d", "address": "fa:16:3e:86:1d:7d", "network": {"id": "9b411d83-42cd-41ac-bdaa-ce305ed6daf9", "bridge": "br-int", "label": "tempest-network-smoke--584081957", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafce301e-b3", "ovs_interfaceid": "afce301e-b341-464f-9f41-fad9a37c454d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.868 187518 DEBUG nova.network.os_vif_util [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "afce301e-b341-464f-9f41-fad9a37c454d", "address": "fa:16:3e:86:1d:7d", "network": {"id": "9b411d83-42cd-41ac-bdaa-ce305ed6daf9", "bridge": "br-int", "label": "tempest-network-smoke--584081957", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.30", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapafce301e-b3", "ovs_interfaceid": "afce301e-b341-464f-9f41-fad9a37c454d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.869 187518 DEBUG nova.network.os_vif_util [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:1d:7d,bridge_name='br-int',has_traffic_filtering=True,id=afce301e-b341-464f-9f41-fad9a37c454d,network=Network(9b411d83-42cd-41ac-bdaa-ce305ed6daf9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapafce301e-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.873 187518 DEBUG nova.virt.libvirt.guest [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] attach device xml: <interface type="ethernet">
Nov 28 19:56:01 np0005539279 nova_compute[187514]:  <mac address="fa:16:3e:86:1d:7d"/>
Nov 28 19:56:01 np0005539279 nova_compute[187514]:  <model type="virtio"/>
Nov 28 19:56:01 np0005539279 nova_compute[187514]:  <driver name="vhost" rx_queue_size="512"/>
Nov 28 19:56:01 np0005539279 nova_compute[187514]:  <mtu size="1442"/>
Nov 28 19:56:01 np0005539279 nova_compute[187514]:  <target dev="tapafce301e-b3"/>
Nov 28 19:56:01 np0005539279 nova_compute[187514]: </interface>
Nov 28 19:56:01 np0005539279 nova_compute[187514]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 28 19:56:01 np0005539279 kernel: tapafce301e-b3: entered promiscuous mode
Nov 28 19:56:01 np0005539279 NetworkManager[55703]: <info>  [1764377761.8885] manager: (tapafce301e-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Nov 28 19:56:01 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:01Z|00053|binding|INFO|Claiming lport afce301e-b341-464f-9f41-fad9a37c454d for this chassis.
Nov 28 19:56:01 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:01Z|00054|binding|INFO|afce301e-b341-464f-9f41-fad9a37c454d: Claiming fa:16:3e:86:1d:7d 10.100.0.30
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.891 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.901 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:1d:7d 10.100.0.30'], port_security=['fa:16:3e:86:1d:7d 10.100.0.30'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.30/28', 'neutron:device_id': '9c7d413a-3895-4195-aef1-97e4607b8046', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b411d83-42cd-41ac-bdaa-ce305ed6daf9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9bd29a3-ae46-41d8-aaea-3325e1bc2031', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1c00096a-b041-450f-b522-cda439daf5e9, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=afce301e-b341-464f-9f41-fad9a37c454d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.903 104584 INFO neutron.agent.ovn.metadata.agent [-] Port afce301e-b341-464f-9f41-fad9a37c454d in datapath 9b411d83-42cd-41ac-bdaa-ce305ed6daf9 bound to our chassis#033[00m
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.904 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b411d83-42cd-41ac-bdaa-ce305ed6daf9#033[00m
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.923 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b1abd1-ef4d-49bd-afb0-14eebee20d86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.924 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b411d83-41 in ovnmeta-9b411d83-42cd-41ac-bdaa-ce305ed6daf9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.927 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b411d83-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.927 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e897b7cd-bb7a-4a34-b2ab-bfddf5d0c342]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.928 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b2126be0-17a2-4602-8b50-53ae545ad9c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:01 np0005539279 systemd-udevd[215294]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.939 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:01 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:01Z|00055|binding|INFO|Setting lport afce301e-b341-464f-9f41-fad9a37c454d ovn-installed in OVS
Nov 28 19:56:01 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:01Z|00056|binding|INFO|Setting lport afce301e-b341-464f-9f41-fad9a37c454d up in Southbound
Nov 28 19:56:01 np0005539279 nova_compute[187514]: 2025-11-29 00:56:01.944 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.945 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[c5826b09-483f-48a0-997c-7b5b4e16df8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:01 np0005539279 NetworkManager[55703]: <info>  [1764377761.9519] device (tapafce301e-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:56:01 np0005539279 NetworkManager[55703]: <info>  [1764377761.9545] device (tapafce301e-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 19:56:01 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:01.977 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[ae06a9b4-a811-410a-948e-a13e9dab24b2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:02 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:02.012 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[61e76b53-8a50-4737-94b5-de3e24c6ec23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:02 np0005539279 systemd-udevd[215297]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:56:02 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:02.021 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4f57eb-3ddc-4272-a72c-efe3870c2ba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:02 np0005539279 NetworkManager[55703]: <info>  [1764377762.0228] manager: (tap9b411d83-40): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Nov 28 19:56:02 np0005539279 nova_compute[187514]: 2025-11-29 00:56:02.027 187518 DEBUG nova.virt.libvirt.driver [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:56:02 np0005539279 nova_compute[187514]: 2025-11-29 00:56:02.028 187518 DEBUG nova.virt.libvirt.driver [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:56:02 np0005539279 nova_compute[187514]: 2025-11-29 00:56:02.030 187518 DEBUG nova.virt.libvirt.driver [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:d5:6e:9a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:56:02 np0005539279 nova_compute[187514]: 2025-11-29 00:56:02.030 187518 DEBUG nova.virt.libvirt.driver [None req-9db65215-4388-4204-91b2-e068ce1f31aa 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:86:1d:7d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:56:10 np0005539279 podman[215578]: 2025-11-29 00:56:10.850510756 +0000 UTC m=+0.093968147 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 19:56:11 np0005539279 rsyslogd[1006]: imjournal: 1096 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Nov 28 19:56:11 np0005539279 nova_compute[187514]: 2025-11-29 00:56:11.838 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:12 np0005539279 nova_compute[187514]: 2025-11-29 00:56:12.253 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:12 np0005539279 nova_compute[187514]: 2025-11-29 00:56:12.426 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:12 np0005539279 nova_compute[187514]: 2025-11-29 00:56:12.514 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:16 np0005539279 nova_compute[187514]: 2025-11-29 00:56:16.849 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:17 np0005539279 nova_compute[187514]: 2025-11-29 00:56:17.287 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:19 np0005539279 nova_compute[187514]: 2025-11-29 00:56:19.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:20 np0005539279 nova_compute[187514]: 2025-11-29 00:56:20.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:20 np0005539279 nova_compute[187514]: 2025-11-29 00:56:20.611 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:20 np0005539279 nova_compute[187514]: 2025-11-29 00:56:20.612 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:56:21 np0005539279 podman[215601]: 2025-11-29 00:56:21.764431108 +0000 UTC m=+0.070522730 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:56:21 np0005539279 podman[215600]: 2025-11-29 00:56:21.79346067 +0000 UTC m=+0.100040168 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 19:56:21 np0005539279 nova_compute[187514]: 2025-11-29 00:56:21.805 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764377766.8034542, 9c7d413a-3895-4195-aef1-97e4607b8046 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:56:21 np0005539279 nova_compute[187514]: 2025-11-29 00:56:21.806 187518 INFO nova.compute.manager [-] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] VM Stopped (Lifecycle Event)#033[00m
Nov 28 19:56:21 np0005539279 nova_compute[187514]: 2025-11-29 00:56:21.834 187518 DEBUG nova.compute.manager [None req-505a5756-8cc4-40d5-bef3-5287614d9d66 - - - - - -] [instance: 9c7d413a-3895-4195-aef1-97e4607b8046] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:56:21 np0005539279 nova_compute[187514]: 2025-11-29 00:56:21.861 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.289 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.632 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.633 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.669 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.670 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.670 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.671 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.972 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.976 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5788MB free_disk=73.34345626831055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.976 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:56:22 np0005539279 nova_compute[187514]: 2025-11-29 00:56:22.977 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:56:23 np0005539279 nova_compute[187514]: 2025-11-29 00:56:23.055 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:56:23 np0005539279 nova_compute[187514]: 2025-11-29 00:56:23.055 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:56:23 np0005539279 nova_compute[187514]: 2025-11-29 00:56:23.093 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:56:23 np0005539279 nova_compute[187514]: 2025-11-29 00:56:23.114 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:56:23 np0005539279 nova_compute[187514]: 2025-11-29 00:56:23.142 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:56:23 np0005539279 nova_compute[187514]: 2025-11-29 00:56:23.143 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:56:24 np0005539279 podman[215648]: 2025-11-29 00:56:24.861704529 +0000 UTC m=+0.093990058 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:56:24 np0005539279 podman[215647]: 2025-11-29 00:56:24.930565942 +0000 UTC m=+0.170049469 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 19:56:25 np0005539279 nova_compute[187514]: 2025-11-29 00:56:25.119 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:25 np0005539279 nova_compute[187514]: 2025-11-29 00:56:25.145 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:25 np0005539279 nova_compute[187514]: 2025-11-29 00:56:25.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:25 np0005539279 nova_compute[187514]: 2025-11-29 00:56:25.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:25 np0005539279 nova_compute[187514]: 2025-11-29 00:56:25.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:56:26 np0005539279 nova_compute[187514]: 2025-11-29 00:56:26.866 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.292 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.690 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.691 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.713 187518 DEBUG nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.809 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.810 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.820 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.820 187518 INFO nova.compute.claims [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.968 187518 DEBUG nova.compute.provider_tree [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:56:27 np0005539279 nova_compute[187514]: 2025-11-29 00:56:27.991 187518 DEBUG nova.scheduler.client.report [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.019 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.209s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.020 187518 DEBUG nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.067 187518 DEBUG nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.067 187518 DEBUG nova.network.neutron [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.089 187518 INFO nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.107 187518 DEBUG nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.215 187518 DEBUG nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.217 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.217 187518 INFO nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Creating image(s)#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.218 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.219 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.220 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.243 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.331 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.337 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.338 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.360 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.387 187518 DEBUG nova.policy [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.443 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.444 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.498 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.500 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.501 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.594 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.596 187518 DEBUG nova.virt.disk.api [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.597 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.654 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.655 187518 DEBUG nova.virt.disk.api [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.655 187518 DEBUG nova.objects.instance [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.677 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.678 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Ensure instance console log exists: /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.678 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.678 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:56:28 np0005539279 nova_compute[187514]: 2025-11-29 00:56:28.679 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:56:29 np0005539279 nova_compute[187514]: 2025-11-29 00:56:29.088 187518 DEBUG nova.network.neutron [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Successfully created port: a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 19:56:29 np0005539279 nova_compute[187514]: 2025-11-29 00:56:29.776 187518 DEBUG nova.network.neutron [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Successfully updated port: a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 19:56:29 np0005539279 nova_compute[187514]: 2025-11-29 00:56:29.797 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:56:29 np0005539279 nova_compute[187514]: 2025-11-29 00:56:29.797 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:56:29 np0005539279 nova_compute[187514]: 2025-11-29 00:56:29.798 187518 DEBUG nova.network.neutron [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 19:56:29 np0005539279 nova_compute[187514]: 2025-11-29 00:56:29.873 187518 DEBUG nova.compute.manager [req-c9bbb1fb-86d9-4bbb-b11e-a6310dacff26 req-b39f1271-aa78-4d60-819a-71019aea9ccc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received event network-changed-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:56:29 np0005539279 nova_compute[187514]: 2025-11-29 00:56:29.874 187518 DEBUG nova.compute.manager [req-c9bbb1fb-86d9-4bbb-b11e-a6310dacff26 req-b39f1271-aa78-4d60-819a-71019aea9ccc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Refreshing instance network info cache due to event network-changed-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:56:29 np0005539279 nova_compute[187514]: 2025-11-29 00:56:29.874 187518 DEBUG oslo_concurrency.lockutils [req-c9bbb1fb-86d9-4bbb-b11e-a6310dacff26 req-b39f1271-aa78-4d60-819a-71019aea9ccc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:56:29 np0005539279 nova_compute[187514]: 2025-11-29 00:56:29.971 187518 DEBUG nova.network.neutron [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 19:56:31 np0005539279 nova_compute[187514]: 2025-11-29 00:56:31.869 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.305 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.464 187518 DEBUG nova.network.neutron [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updating instance_info_cache with network_info: [{"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.490 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.491 187518 DEBUG nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Instance network_info: |[{"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.491 187518 DEBUG oslo_concurrency.lockutils [req-c9bbb1fb-86d9-4bbb-b11e-a6310dacff26 req-b39f1271-aa78-4d60-819a-71019aea9ccc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.492 187518 DEBUG nova.network.neutron [req-c9bbb1fb-86d9-4bbb-b11e-a6310dacff26 req-b39f1271-aa78-4d60-819a-71019aea9ccc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Refreshing network info cache for port a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.497 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Start _get_guest_xml network_info=[{"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.505 187518 WARNING nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.520 187518 DEBUG nova.virt.libvirt.host [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.521 187518 DEBUG nova.virt.libvirt.host [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.525 187518 DEBUG nova.virt.libvirt.host [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.526 187518 DEBUG nova.virt.libvirt.host [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.527 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.528 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.529 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.529 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.530 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.530 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.530 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.531 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.531 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.532 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.532 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.532 187518 DEBUG nova.virt.hardware [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.539 187518 DEBUG nova.virt.libvirt.vif [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:56:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-302931196',display_name='tempest-TestNetworkBasicOps-server-302931196',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-302931196',id=4,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGVEwm853y2eS199MxO6IEnfj3smPW9ngtv23k+04V7lQDOEde/4DwChpU/dPMjVKi0udHxpzA16alvdHRWsfEQmqVLBInT31956bVheL4YLYKxXq/G18LlYLmDWEYX7Q==',key_name='tempest-TestNetworkBasicOps-734851356',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-70t6y47o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:56:28Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.540 187518 DEBUG nova.network.os_vif_util [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.541 187518 DEBUG nova.network.os_vif_util [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:7e:a1,bridge_name='br-int',has_traffic_filtering=True,id=a8cbd84f-18a7-4baf-9ce9-0617d15f9c10,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cbd84f-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.543 187518 DEBUG nova.objects.instance [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.566 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] End _get_guest_xml xml=<domain type="kvm">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <uuid>f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418</uuid>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <name>instance-00000004</name>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-302931196</nova:name>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 00:56:32</nova:creationTime>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:        <nova:port uuid="a8cbd84f-18a7-4baf-9ce9-0617d15f9c10">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <system>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <entry name="serial">f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418</entry>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <entry name="uuid">f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418</entry>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    </system>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <os>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  </clock>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.config"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:56:7e:a1"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <target dev="tapa8cbd84f-18"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/console.log" append="off"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    </serial>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <video>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 19:56:32 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 19:56:32 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:56:32 np0005539279 nova_compute[187514]: </domain>
Nov 28 19:56:32 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.567 187518 DEBUG nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Preparing to wait for external event network-vif-plugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.568 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.569 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.569 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.571 187518 DEBUG nova.virt.libvirt.vif [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:56:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-302931196',display_name='tempest-TestNetworkBasicOps-server-302931196',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-302931196',id=4,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGVEwm853y2eS199MxO6IEnfj3smPW9ngtv23k+04V7lQDOEde/4DwChpU/dPMjVKi0udHxpzA16alvdHRWsfEQmqVLBInT31956bVheL4YLYKxXq/G18LlYLmDWEYX7Q==',key_name='tempest-TestNetworkBasicOps-734851356',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-70t6y47o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:56:28Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.572 187518 DEBUG nova.network.os_vif_util [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.573 187518 DEBUG nova.network.os_vif_util [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:56:7e:a1,bridge_name='br-int',has_traffic_filtering=True,id=a8cbd84f-18a7-4baf-9ce9-0617d15f9c10,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cbd84f-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.574 187518 DEBUG os_vif [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:7e:a1,bridge_name='br-int',has_traffic_filtering=True,id=a8cbd84f-18a7-4baf-9ce9-0617d15f9c10,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cbd84f-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.575 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.576 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.577 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.582 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.582 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8cbd84f-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.584 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa8cbd84f-18, col_values=(('external_ids', {'iface-id': 'a8cbd84f-18a7-4baf-9ce9-0617d15f9c10', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:56:7e:a1', 'vm-uuid': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:56:32 np0005539279 NetworkManager[55703]: <info>  [1764377792.5880] manager: (tapa8cbd84f-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.589 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.597 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.598 187518 INFO os_vif [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:56:7e:a1,bridge_name='br-int',has_traffic_filtering=True,id=a8cbd84f-18a7-4baf-9ce9-0617d15f9c10,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cbd84f-18')#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.671 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.672 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.672 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:56:7e:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:56:32 np0005539279 nova_compute[187514]: 2025-11-29 00:56:32.674 187518 INFO nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Using config drive#033[00m
Nov 28 19:56:33 np0005539279 nova_compute[187514]: 2025-11-29 00:56:33.508 187518 INFO nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Creating config drive at /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.config#033[00m
Nov 28 19:56:33 np0005539279 nova_compute[187514]: 2025-11-29 00:56:33.517 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9cyi3t5z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:56:33 np0005539279 nova_compute[187514]: 2025-11-29 00:56:33.658 187518 DEBUG oslo_concurrency.processutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9cyi3t5z" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:56:33 np0005539279 kernel: tapa8cbd84f-18: entered promiscuous mode
Nov 28 19:56:33 np0005539279 NetworkManager[55703]: <info>  [1764377793.7492] manager: (tapa8cbd84f-18): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Nov 28 19:56:33 np0005539279 nova_compute[187514]: 2025-11-29 00:56:33.751 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:33Z|00074|binding|INFO|Claiming lport a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 for this chassis.
Nov 28 19:56:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:33Z|00075|binding|INFO|a8cbd84f-18a7-4baf-9ce9-0617d15f9c10: Claiming fa:16:3e:56:7e:a1 10.100.0.3
Nov 28 19:56:33 np0005539279 nova_compute[187514]: 2025-11-29 00:56:33.760 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:33 np0005539279 nova_compute[187514]: 2025-11-29 00:56:33.765 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.777 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:7e:a1 10.100.0.3'], port_security=['fa:16:3e:56:7e:a1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3464627b-636f-42dd-ae8e-b4b260cea225', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2f94587-1ade-4577-8fa4-d6db6a73fd0b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aaff49b6-7ffb-4dbf-949e-3d42bb7e7357, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=a8cbd84f-18a7-4baf-9ce9-0617d15f9c10) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.779 104584 INFO neutron.agent.ovn.metadata.agent [-] Port a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 in datapath 3464627b-636f-42dd-ae8e-b4b260cea225 bound to our chassis#033[00m
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.781 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3464627b-636f-42dd-ae8e-b4b260cea225#033[00m
Nov 28 19:56:33 np0005539279 systemd-udevd[215733]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.799 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[53b56b4e-6c69-455f-92e5-51196af226aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.800 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3464627b-61 in ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.803 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3464627b-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.803 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[5273aa4b-1477-4255-a711-1ffc24bf4a45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.804 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e3dcc6f1-2dc6-41b3-93b6-5b08856b8b09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:33 np0005539279 systemd-machined[153752]: New machine qemu-4-instance-00000004.
Nov 28 19:56:33 np0005539279 NetworkManager[55703]: <info>  [1764377793.8180] device (tapa8cbd84f-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:56:33 np0005539279 NetworkManager[55703]: <info>  [1764377793.8204] device (tapa8cbd84f-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.822 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[839d5f17-53f3-4b73-ba4a-ff4156878c6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:56:33 np0005539279 nova_compute[187514]: 2025-11-29 00:56:33.846 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:33Z|00076|binding|INFO|Setting lport a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 ovn-installed in OVS
Nov 28 19:56:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:33Z|00077|binding|INFO|Setting lport a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 up in Southbound
Nov 28 19:56:33 np0005539279 nova_compute[187514]: 2025-11-29 00:56:33.850 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:33 np0005539279 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.855 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[a05b4f09-8f99-4f34-b90c-6bc378471c6e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.898 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[30609324-9e59-45f0-95f9-00cd43bdb5ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.907 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[470957a8-28ab-4563-8ccc-09e50ac6b16c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:33 np0005539279 NetworkManager[55703]: <info>  [1764377793.9084] manager: (tap3464627b-60): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.954 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[d246892d-5146-4e00-bd1c-bf9c6374c0bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:33.960 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[da7c8c6c-4428-4eee-9c54-87820ce17320]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:34 np0005539279 NetworkManager[55703]: <info>  [1764377794.0021] device (tap3464627b-60): carrier: link connected
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.010 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[8be550c8-f61f-4fa3-96f3-7b8035a7ab14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.037 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[7c16fb90-014d-46d0-bee5-1a17a56496cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3464627b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:b4:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372770, 'reachable_time': 19228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215766, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.061 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad5cf32-6722-4b30-a347-2d7812324a70]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:b466'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372770, 'tstamp': 372770}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215767, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.087 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[656fd59e-c37a-4176-b4d1-f14be3ed2a8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3464627b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:b4:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372770, 'reachable_time': 19228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215768, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.137 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[6871df78-cc88-46fe-9119-dd207f663aec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.236 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[7a5a4c20-15c4-4133-858c-c9f18f596e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.238 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3464627b-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.239 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.240 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3464627b-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 19:56:34 np0005539279 NetworkManager[55703]: <info>  [1764377794.2437] manager: (tap3464627b-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 28 19:56:34 np0005539279 kernel: tap3464627b-60: entered promiscuous mode
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.242 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.246 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.247 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3464627b-60, col_values=(('external_ids', {'iface-id': '2e2525c3-99d8-49dc-9041-3095814f1167'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.248 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 19:56:34 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:34Z|00078|binding|INFO|Releasing lport 2e2525c3-99d8-49dc-9041-3095814f1167 from this chassis (sb_readonly=0)
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.269 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377794.2685359, f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.269 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] VM Started (Lifecycle Event)
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.276 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.278 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3464627b-636f-42dd-ae8e-b4b260cea225.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3464627b-636f-42dd-ae8e-b4b260cea225.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.279 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c836f8b2-9063-49c7-a10b-991efd3debf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.280 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-3464627b-636f-42dd-ae8e-b4b260cea225
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/3464627b-636f-42dd-ae8e-b4b260cea225.pid.haproxy
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID 3464627b-636f-42dd-ae8e-b4b260cea225
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 28 19:56:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:34.281 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'env', 'PROCESS_TAG=haproxy-3464627b-636f-42dd-ae8e-b4b260cea225', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3464627b-636f-42dd-ae8e-b4b260cea225.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.293 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.299 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377794.2691264, f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.300 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] VM Paused (Lifecycle Event)
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.333 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.338 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.369 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.607 187518 DEBUG nova.compute.manager [req-9a898b68-6fba-4061-99f9-a1e5683e5e70 req-82b3f120-f083-47bc-9684-dd3f59653483 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received event network-vif-plugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.607 187518 DEBUG oslo_concurrency.lockutils [req-9a898b68-6fba-4061-99f9-a1e5683e5e70 req-82b3f120-f083-47bc-9684-dd3f59653483 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.607 187518 DEBUG oslo_concurrency.lockutils [req-9a898b68-6fba-4061-99f9-a1e5683e5e70 req-82b3f120-f083-47bc-9684-dd3f59653483 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.608 187518 DEBUG oslo_concurrency.lockutils [req-9a898b68-6fba-4061-99f9-a1e5683e5e70 req-82b3f120-f083-47bc-9684-dd3f59653483 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.608 187518 DEBUG nova.compute.manager [req-9a898b68-6fba-4061-99f9-a1e5683e5e70 req-82b3f120-f083-47bc-9684-dd3f59653483 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Processing event network-vif-plugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.608 187518 DEBUG nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.611 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377794.611298, f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.612 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] VM Resumed (Lifecycle Event)
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.613 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.616 187518 INFO nova.virt.libvirt.driver [-] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Instance spawned successfully.
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.616 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 19:56:34 np0005539279 podman[215807]: 2025-11-29 00:56:34.630554261 +0000 UTC m=+0.051352373 container create aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.632 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.635 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.635 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.635 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.636 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.636 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.636 187518 DEBUG nova.virt.libvirt.driver [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.639 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 19:56:34 np0005539279 systemd[1]: Started libpod-conmon-aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f.scope.
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.663 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 19:56:34 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.695 187518 INFO nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Took 6.48 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.695 187518 DEBUG nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:56:34 np0005539279 podman[215807]: 2025-11-29 00:56:34.601396455 +0000 UTC m=+0.022194597 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 19:56:34 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/833d507e5fada38f13b7042c49a427b0a536e27362584a7072f886b0fa7b58df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 19:56:34 np0005539279 podman[215807]: 2025-11-29 00:56:34.708551108 +0000 UTC m=+0.129349240 container init aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 19:56:34 np0005539279 podman[215807]: 2025-11-29 00:56:34.716567173 +0000 UTC m=+0.137365285 container start aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:56:34 np0005539279 neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225[215821]: [NOTICE]   (215827) : New worker (215829) forked
Nov 28 19:56:34 np0005539279 neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225[215821]: [NOTICE]   (215827) : Loading success.
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.747 187518 INFO nova.compute.manager [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Took 6.97 seconds to build instance.#033[00m
Nov 28 19:56:34 np0005539279 nova_compute[187514]: 2025-11-29 00:56:34.760 187518 DEBUG oslo_concurrency.lockutils [None req-4ef3dff3-17a3-4b5e-93ba-6988cc3ff50b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:56:35 np0005539279 nova_compute[187514]: 2025-11-29 00:56:35.621 187518 DEBUG nova.network.neutron [req-c9bbb1fb-86d9-4bbb-b11e-a6310dacff26 req-b39f1271-aa78-4d60-819a-71019aea9ccc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updated VIF entry in instance network info cache for port a8cbd84f-18a7-4baf-9ce9-0617d15f9c10. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:56:35 np0005539279 nova_compute[187514]: 2025-11-29 00:56:35.621 187518 DEBUG nova.network.neutron [req-c9bbb1fb-86d9-4bbb-b11e-a6310dacff26 req-b39f1271-aa78-4d60-819a-71019aea9ccc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updating instance_info_cache with network_info: [{"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:56:35 np0005539279 nova_compute[187514]: 2025-11-29 00:56:35.637 187518 DEBUG oslo_concurrency.lockutils [req-c9bbb1fb-86d9-4bbb-b11e-a6310dacff26 req-b39f1271-aa78-4d60-819a-71019aea9ccc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:56:36 np0005539279 nova_compute[187514]: 2025-11-29 00:56:36.709 187518 DEBUG nova.compute.manager [req-3b96b2a1-e424-49fa-9642-b5943a686c2d req-f69e8423-440e-4fce-9612-36b90efcb330 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received event network-vif-plugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:56:36 np0005539279 nova_compute[187514]: 2025-11-29 00:56:36.710 187518 DEBUG oslo_concurrency.lockutils [req-3b96b2a1-e424-49fa-9642-b5943a686c2d req-f69e8423-440e-4fce-9612-36b90efcb330 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:56:36 np0005539279 nova_compute[187514]: 2025-11-29 00:56:36.710 187518 DEBUG oslo_concurrency.lockutils [req-3b96b2a1-e424-49fa-9642-b5943a686c2d req-f69e8423-440e-4fce-9612-36b90efcb330 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:56:36 np0005539279 nova_compute[187514]: 2025-11-29 00:56:36.711 187518 DEBUG oslo_concurrency.lockutils [req-3b96b2a1-e424-49fa-9642-b5943a686c2d req-f69e8423-440e-4fce-9612-36b90efcb330 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:56:36 np0005539279 nova_compute[187514]: 2025-11-29 00:56:36.711 187518 DEBUG nova.compute.manager [req-3b96b2a1-e424-49fa-9642-b5943a686c2d req-f69e8423-440e-4fce-9612-36b90efcb330 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] No waiting events found dispatching network-vif-plugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:56:36 np0005539279 nova_compute[187514]: 2025-11-29 00:56:36.711 187518 WARNING nova.compute.manager [req-3b96b2a1-e424-49fa-9642-b5943a686c2d req-f69e8423-440e-4fce-9612-36b90efcb330 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received unexpected event network-vif-plugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 for instance with vm_state active and task_state None.#033[00m
Nov 28 19:56:37 np0005539279 nova_compute[187514]: 2025-11-29 00:56:37.308 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:37 np0005539279 nova_compute[187514]: 2025-11-29 00:56:37.587 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:38.346 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:56:38 np0005539279 nova_compute[187514]: 2025-11-29 00:56:38.346 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:38.348 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 19:56:38 np0005539279 nova_compute[187514]: 2025-11-29 00:56:38.796 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:38 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:38Z|00079|binding|INFO|Releasing lport 2e2525c3-99d8-49dc-9041-3095814f1167 from this chassis (sb_readonly=0)
Nov 28 19:56:38 np0005539279 NetworkManager[55703]: <info>  [1764377798.7979] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Nov 28 19:56:38 np0005539279 NetworkManager[55703]: <info>  [1764377798.7995] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 28 19:56:38 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:38Z|00080|binding|INFO|Releasing lport 2e2525c3-99d8-49dc-9041-3095814f1167 from this chassis (sb_readonly=0)
Nov 28 19:56:38 np0005539279 nova_compute[187514]: 2025-11-29 00:56:38.854 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:38 np0005539279 nova_compute[187514]: 2025-11-29 00:56:38.861 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:39 np0005539279 nova_compute[187514]: 2025-11-29 00:56:39.759 187518 DEBUG nova.compute.manager [req-b27d0c2e-0d85-4901-9347-7ec89eeb4030 req-ba02f203-6d1c-4df0-8260-e023499fd27c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received event network-changed-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:56:39 np0005539279 nova_compute[187514]: 2025-11-29 00:56:39.760 187518 DEBUG nova.compute.manager [req-b27d0c2e-0d85-4901-9347-7ec89eeb4030 req-ba02f203-6d1c-4df0-8260-e023499fd27c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Refreshing instance network info cache due to event network-changed-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:56:39 np0005539279 nova_compute[187514]: 2025-11-29 00:56:39.760 187518 DEBUG oslo_concurrency.lockutils [req-b27d0c2e-0d85-4901-9347-7ec89eeb4030 req-ba02f203-6d1c-4df0-8260-e023499fd27c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:56:39 np0005539279 nova_compute[187514]: 2025-11-29 00:56:39.761 187518 DEBUG oslo_concurrency.lockutils [req-b27d0c2e-0d85-4901-9347-7ec89eeb4030 req-ba02f203-6d1c-4df0-8260-e023499fd27c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:56:39 np0005539279 nova_compute[187514]: 2025-11-29 00:56:39.761 187518 DEBUG nova.network.neutron [req-b27d0c2e-0d85-4901-9347-7ec89eeb4030 req-ba02f203-6d1c-4df0-8260-e023499fd27c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Refreshing network info cache for port a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:56:39 np0005539279 podman[215841]: 2025-11-29 00:56:39.845575439 +0000 UTC m=+0.076225144 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 19:56:39 np0005539279 podman[215839]: 2025-11-29 00:56:39.873581933 +0000 UTC m=+0.103140336 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 19:56:41 np0005539279 nova_compute[187514]: 2025-11-29 00:56:41.685 187518 DEBUG nova.network.neutron [req-b27d0c2e-0d85-4901-9347-7ec89eeb4030 req-ba02f203-6d1c-4df0-8260-e023499fd27c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updated VIF entry in instance network info cache for port a8cbd84f-18a7-4baf-9ce9-0617d15f9c10. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:56:41 np0005539279 nova_compute[187514]: 2025-11-29 00:56:41.686 187518 DEBUG nova.network.neutron [req-b27d0c2e-0d85-4901-9347-7ec89eeb4030 req-ba02f203-6d1c-4df0-8260-e023499fd27c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updating instance_info_cache with network_info: [{"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:56:41 np0005539279 podman[215886]: 2025-11-29 00:56:41.867190086 +0000 UTC m=+0.107077072 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:56:41 np0005539279 nova_compute[187514]: 2025-11-29 00:56:41.897 187518 DEBUG oslo_concurrency.lockutils [req-b27d0c2e-0d85-4901-9347-7ec89eeb4030 req-ba02f203-6d1c-4df0-8260-e023499fd27c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:56:42 np0005539279 nova_compute[187514]: 2025-11-29 00:56:42.311 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:42 np0005539279 nova_compute[187514]: 2025-11-29 00:56:42.590 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:47 np0005539279 nova_compute[187514]: 2025-11-29 00:56:47.312 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:47 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:47Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:56:7e:a1 10.100.0.3
Nov 28 19:56:47 np0005539279 ovn_controller[95686]: 2025-11-29T00:56:47Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:56:7e:a1 10.100.0.3
Nov 28 19:56:47 np0005539279 nova_compute[187514]: 2025-11-29 00:56:47.593 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:48 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:56:48.350 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:56:52 np0005539279 nova_compute[187514]: 2025-11-29 00:56:52.355 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:52 np0005539279 nova_compute[187514]: 2025-11-29 00:56:52.595 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:52 np0005539279 podman[215925]: 2025-11-29 00:56:52.859077258 +0000 UTC m=+0.095656746 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Nov 28 19:56:52 np0005539279 podman[215926]: 2025-11-29 00:56:52.880813368 +0000 UTC m=+0.112551533 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:56:54 np0005539279 nova_compute[187514]: 2025-11-29 00:56:54.360 187518 INFO nova.compute.manager [None req-c7ee5cda-5574-4c5a-8f7c-cd97fcbcc350 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Get console output#033[00m
Nov 28 19:56:54 np0005539279 nova_compute[187514]: 2025-11-29 00:56:54.368 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 19:56:55 np0005539279 podman[215973]: 2025-11-29 00:56:55.849095112 +0000 UTC m=+0.080738157 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:56:55 np0005539279 podman[215972]: 2025-11-29 00:56:55.909834209 +0000 UTC m=+0.146716818 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:56:56 np0005539279 nova_compute[187514]: 2025-11-29 00:56:56.124 187518 DEBUG nova.compute.manager [req-9b0fdb74-852c-4379-acd2-9446cd264967 req-7a68287f-fea5-4861-adeb-4a606aba54bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received event network-changed-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:56:56 np0005539279 nova_compute[187514]: 2025-11-29 00:56:56.124 187518 DEBUG nova.compute.manager [req-9b0fdb74-852c-4379-acd2-9446cd264967 req-7a68287f-fea5-4861-adeb-4a606aba54bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Refreshing instance network info cache due to event network-changed-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:56:56 np0005539279 nova_compute[187514]: 2025-11-29 00:56:56.124 187518 DEBUG oslo_concurrency.lockutils [req-9b0fdb74-852c-4379-acd2-9446cd264967 req-7a68287f-fea5-4861-adeb-4a606aba54bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:56:56 np0005539279 nova_compute[187514]: 2025-11-29 00:56:56.125 187518 DEBUG oslo_concurrency.lockutils [req-9b0fdb74-852c-4379-acd2-9446cd264967 req-7a68287f-fea5-4861-adeb-4a606aba54bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:56:56 np0005539279 nova_compute[187514]: 2025-11-29 00:56:56.125 187518 DEBUG nova.network.neutron [req-9b0fdb74-852c-4379-acd2-9446cd264967 req-7a68287f-fea5-4861-adeb-4a606aba54bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Refreshing network info cache for port a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:56:57 np0005539279 nova_compute[187514]: 2025-11-29 00:56:57.330 187518 DEBUG nova.network.neutron [req-9b0fdb74-852c-4379-acd2-9446cd264967 req-7a68287f-fea5-4861-adeb-4a606aba54bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updated VIF entry in instance network info cache for port a8cbd84f-18a7-4baf-9ce9-0617d15f9c10. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:56:57 np0005539279 nova_compute[187514]: 2025-11-29 00:56:57.330 187518 DEBUG nova.network.neutron [req-9b0fdb74-852c-4379-acd2-9446cd264967 req-7a68287f-fea5-4861-adeb-4a606aba54bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updating instance_info_cache with network_info: [{"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:56:57 np0005539279 nova_compute[187514]: 2025-11-29 00:56:57.348 187518 DEBUG oslo_concurrency.lockutils [req-9b0fdb74-852c-4379-acd2-9446cd264967 req-7a68287f-fea5-4861-adeb-4a606aba54bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:56:57 np0005539279 nova_compute[187514]: 2025-11-29 00:56:57.357 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:56:57 np0005539279 nova_compute[187514]: 2025-11-29 00:56:57.598 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:02 np0005539279 nova_compute[187514]: 2025-11-29 00:57:02.360 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:02 np0005539279 nova_compute[187514]: 2025-11-29 00:57:02.600 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:02 np0005539279 nova_compute[187514]: 2025-11-29 00:57:02.861 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "7e8ca829-ad9a-476a-afe3-92e4f655c723" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:02 np0005539279 nova_compute[187514]: 2025-11-29 00:57:02.862 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:02 np0005539279 nova_compute[187514]: 2025-11-29 00:57:02.881 187518 DEBUG nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 19:57:02 np0005539279 nova_compute[187514]: 2025-11-29 00:57:02.987 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:02 np0005539279 nova_compute[187514]: 2025-11-29 00:57:02.988 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:02.999 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.000 187518 INFO nova.compute.claims [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.181 187518 DEBUG nova.compute.provider_tree [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.203 187518 DEBUG nova.scheduler.client.report [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.235 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.236 187518 DEBUG nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.311 187518 DEBUG nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.312 187518 DEBUG nova.network.neutron [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.340 187518 INFO nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.366 187518 DEBUG nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.496 187518 DEBUG nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.499 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.499 187518 INFO nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Creating image(s)#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.500 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.501 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.502 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.528 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.628 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.630 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.631 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.655 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.744 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.746 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.802 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.804 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.805 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.894 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.897 187518 DEBUG nova.virt.disk.api [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.898 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.987 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.989 187518 DEBUG nova.virt.disk.api [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 19:57:03 np0005539279 nova_compute[187514]: 2025-11-29 00:57:03.990 187518 DEBUG nova.objects.instance [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid 7e8ca829-ad9a-476a-afe3-92e4f655c723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:57:04 np0005539279 nova_compute[187514]: 2025-11-29 00:57:04.015 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 19:57:04 np0005539279 nova_compute[187514]: 2025-11-29 00:57:04.016 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Ensure instance console log exists: /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 19:57:04 np0005539279 nova_compute[187514]: 2025-11-29 00:57:04.017 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:04 np0005539279 nova_compute[187514]: 2025-11-29 00:57:04.018 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:04 np0005539279 nova_compute[187514]: 2025-11-29 00:57:04.018 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:04 np0005539279 nova_compute[187514]: 2025-11-29 00:57:04.374 187518 DEBUG nova.policy [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 19:57:07 np0005539279 nova_compute[187514]: 2025-11-29 00:57:07.362 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:07 np0005539279 nova_compute[187514]: 2025-11-29 00:57:07.466 187518 DEBUG nova.network.neutron [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Successfully created port: f6ce3521-17c6-45ca-bf6e-55c091ec29c7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 19:57:07 np0005539279 nova_compute[187514]: 2025-11-29 00:57:07.603 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:08.090 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:08.091 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:08.092 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:09 np0005539279 nova_compute[187514]: 2025-11-29 00:57:09.452 187518 DEBUG nova.network.neutron [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Successfully updated port: f6ce3521-17c6-45ca-bf6e-55c091ec29c7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 19:57:09 np0005539279 nova_compute[187514]: 2025-11-29 00:57:09.471 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-7e8ca829-ad9a-476a-afe3-92e4f655c723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:57:09 np0005539279 nova_compute[187514]: 2025-11-29 00:57:09.472 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-7e8ca829-ad9a-476a-afe3-92e4f655c723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:57:09 np0005539279 nova_compute[187514]: 2025-11-29 00:57:09.472 187518 DEBUG nova.network.neutron [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 19:57:09 np0005539279 nova_compute[187514]: 2025-11-29 00:57:09.550 187518 DEBUG nova.compute.manager [req-2bc3103a-ad30-41e5-b65c-f08f23d8f050 req-4b34ec2a-1182-41b5-a672-817bf34e3abe 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received event network-changed-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:57:09 np0005539279 nova_compute[187514]: 2025-11-29 00:57:09.550 187518 DEBUG nova.compute.manager [req-2bc3103a-ad30-41e5-b65c-f08f23d8f050 req-4b34ec2a-1182-41b5-a672-817bf34e3abe 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Refreshing instance network info cache due to event network-changed-f6ce3521-17c6-45ca-bf6e-55c091ec29c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:57:09 np0005539279 nova_compute[187514]: 2025-11-29 00:57:09.551 187518 DEBUG oslo_concurrency.lockutils [req-2bc3103a-ad30-41e5-b65c-f08f23d8f050 req-4b34ec2a-1182-41b5-a672-817bf34e3abe 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-7e8ca829-ad9a-476a-afe3-92e4f655c723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:57:09 np0005539279 nova_compute[187514]: 2025-11-29 00:57:09.635 187518 DEBUG nova.network.neutron [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 19:57:10 np0005539279 podman[216037]: 2025-11-29 00:57:10.849655882 +0000 UTC m=+0.084299712 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 19:57:10 np0005539279 podman[216038]: 2025-11-29 00:57:10.863071807 +0000 UTC m=+0.090750198 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.736 187518 DEBUG nova.network.neutron [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Updating instance_info_cache with network_info: [{"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.771 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-7e8ca829-ad9a-476a-afe3-92e4f655c723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.772 187518 DEBUG nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Instance network_info: |[{"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.772 187518 DEBUG oslo_concurrency.lockutils [req-2bc3103a-ad30-41e5-b65c-f08f23d8f050 req-4b34ec2a-1182-41b5-a672-817bf34e3abe 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-7e8ca829-ad9a-476a-afe3-92e4f655c723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.772 187518 DEBUG nova.network.neutron [req-2bc3103a-ad30-41e5-b65c-f08f23d8f050 req-4b34ec2a-1182-41b5-a672-817bf34e3abe 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Refreshing network info cache for port f6ce3521-17c6-45ca-bf6e-55c091ec29c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.776 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Start _get_guest_xml network_info=[{"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.782 187518 WARNING nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.787 187518 DEBUG nova.virt.libvirt.host [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.787 187518 DEBUG nova.virt.libvirt.host [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.791 187518 DEBUG nova.virt.libvirt.host [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.791 187518 DEBUG nova.virt.libvirt.host [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.792 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.792 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.793 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.793 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.794 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.794 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.794 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.794 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.795 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.795 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.795 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.796 187518 DEBUG nova.virt.hardware [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.801 187518 DEBUG nova.virt.libvirt.vif [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:57:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1750823135',display_name='tempest-TestNetworkBasicOps-server-1750823135',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1750823135',id=5,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCP43JqvRGAxWdvk3K3a0G6FyqPzv+FFHCpIJnBjZ0AHMx6QrOf6TDyM8pvxhfQgXbL5drg0ciJrBrarxp8AXjh0nfAH+ZBgtvaGK28fLVWywfb9yZbA262SAEx+aVtg7g==',key_name='tempest-TestNetworkBasicOps-1823002382',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-fpl15jni',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:57:03Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=7e8ca829-ad9a-476a-afe3-92e4f655c723,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.802 187518 DEBUG nova.network.os_vif_util [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.803 187518 DEBUG nova.network.os_vif_util [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:d1:7f,bridge_name='br-int',has_traffic_filtering=True,id=f6ce3521-17c6-45ca-bf6e-55c091ec29c7,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6ce3521-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.805 187518 DEBUG nova.objects.instance [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e8ca829-ad9a-476a-afe3-92e4f655c723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.827 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] End _get_guest_xml xml=<domain type="kvm">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <uuid>7e8ca829-ad9a-476a-afe3-92e4f655c723</uuid>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <name>instance-00000005</name>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-1750823135</nova:name>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 00:57:11</nova:creationTime>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:        <nova:port uuid="f6ce3521-17c6-45ca-bf6e-55c091ec29c7">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <system>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <entry name="serial">7e8ca829-ad9a-476a-afe3-92e4f655c723</entry>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <entry name="uuid">7e8ca829-ad9a-476a-afe3-92e4f655c723</entry>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    </system>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <os>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  </clock>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk.config"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:11:d1:7f"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <target dev="tapf6ce3521-17"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/console.log" append="off"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    </serial>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <video>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 19:57:11 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 19:57:11 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:57:11 np0005539279 nova_compute[187514]: </domain>
Nov 28 19:57:11 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.827 187518 DEBUG nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Preparing to wait for external event network-vif-plugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.828 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.828 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.828 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.829 187518 DEBUG nova.virt.libvirt.vif [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:57:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1750823135',display_name='tempest-TestNetworkBasicOps-server-1750823135',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1750823135',id=5,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCP43JqvRGAxWdvk3K3a0G6FyqPzv+FFHCpIJnBjZ0AHMx6QrOf6TDyM8pvxhfQgXbL5drg0ciJrBrarxp8AXjh0nfAH+ZBgtvaGK28fLVWywfb9yZbA262SAEx+aVtg7g==',key_name='tempest-TestNetworkBasicOps-1823002382',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-fpl15jni',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:57:03Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=7e8ca829-ad9a-476a-afe3-92e4f655c723,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.830 187518 DEBUG nova.network.os_vif_util [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.831 187518 DEBUG nova.network.os_vif_util [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:d1:7f,bridge_name='br-int',has_traffic_filtering=True,id=f6ce3521-17c6-45ca-bf6e-55c091ec29c7,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6ce3521-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.831 187518 DEBUG os_vif [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:d1:7f,bridge_name='br-int',has_traffic_filtering=True,id=f6ce3521-17c6-45ca-bf6e-55c091ec29c7,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6ce3521-17') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.832 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.832 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.833 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.835 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.835 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6ce3521-17, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.836 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf6ce3521-17, col_values=(('external_ids', {'iface-id': 'f6ce3521-17c6-45ca-bf6e-55c091ec29c7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:d1:7f', 'vm-uuid': '7e8ca829-ad9a-476a-afe3-92e4f655c723'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.838 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:11 np0005539279 NetworkManager[55703]: <info>  [1764377831.8398] manager: (tapf6ce3521-17): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.843 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.847 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.849 187518 INFO os_vif [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:d1:7f,bridge_name='br-int',has_traffic_filtering=True,id=f6ce3521-17c6-45ca-bf6e-55c091ec29c7,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6ce3521-17')#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.920 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.921 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.922 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:11:d1:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:57:11 np0005539279 nova_compute[187514]: 2025-11-29 00:57:11.922 187518 INFO nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Using config drive#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.249 187518 INFO nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Creating config drive at /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk.config#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.258 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp19hl2_jx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.364 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.399 187518 DEBUG oslo_concurrency.processutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp19hl2_jx" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:12 np0005539279 kernel: tapf6ce3521-17: entered promiscuous mode
Nov 28 19:57:12 np0005539279 NetworkManager[55703]: <info>  [1764377832.4923] manager: (tapf6ce3521-17): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Nov 28 19:57:12 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:12Z|00081|binding|INFO|Claiming lport f6ce3521-17c6-45ca-bf6e-55c091ec29c7 for this chassis.
Nov 28 19:57:12 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:12Z|00082|binding|INFO|f6ce3521-17c6-45ca-bf6e-55c091ec29c7: Claiming fa:16:3e:11:d1:7f 10.100.0.5
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.501 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.511 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:d1:7f 10.100.0.5'], port_security=['fa:16:3e:11:d1:7f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7e8ca829-ad9a-476a-afe3-92e4f655c723', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3464627b-636f-42dd-ae8e-b4b260cea225', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02a5c569-2c63-4158-9f46-b0626b15a7d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aaff49b6-7ffb-4dbf-949e-3d42bb7e7357, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=f6ce3521-17c6-45ca-bf6e-55c091ec29c7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.513 104584 INFO neutron.agent.ovn.metadata.agent [-] Port f6ce3521-17c6-45ca-bf6e-55c091ec29c7 in datapath 3464627b-636f-42dd-ae8e-b4b260cea225 bound to our chassis#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.515 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3464627b-636f-42dd-ae8e-b4b260cea225#033[00m
Nov 28 19:57:12 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:12Z|00083|binding|INFO|Setting lport f6ce3521-17c6-45ca-bf6e-55c091ec29c7 ovn-installed in OVS
Nov 28 19:57:12 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:12Z|00084|binding|INFO|Setting lport f6ce3521-17c6-45ca-bf6e-55c091ec29c7 up in Southbound
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.530 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.538 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.537 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[926f6464-96b0-43d4-bc02-c01403274ff4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:12 np0005539279 systemd-udevd[216115]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:57:12 np0005539279 systemd-machined[153752]: New machine qemu-5-instance-00000005.
Nov 28 19:57:12 np0005539279 NetworkManager[55703]: <info>  [1764377832.5676] device (tapf6ce3521-17): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:57:12 np0005539279 NetworkManager[55703]: <info>  [1764377832.5693] device (tapf6ce3521-17): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 19:57:12 np0005539279 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.582 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[4981a091-da2d-4eee-a691-d5119f36b188]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.586 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6b70ef-1b73-4fa7-9452-7d9203ac3619]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:12 np0005539279 podman[216095]: 2025-11-29 00:57:12.604357696 +0000 UTC m=+0.115347727 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute)
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.617 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[44d94b60-47d9-454d-ac60-e38c931da388]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.635 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d8043c-7b2a-42ad-a535-21eb6c9acd2b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3464627b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:b4:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372770, 'reachable_time': 19228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216132, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.654 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[7dcb8fcc-fb62-4fbb-94d3-e3be932c9e25]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3464627b-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372789, 'tstamp': 372789}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216136, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3464627b-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372793, 'tstamp': 372793}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216136, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.655 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3464627b-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.657 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.658 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.659 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3464627b-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.659 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.659 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3464627b-60, col_values=(('external_ids', {'iface-id': '2e2525c3-99d8-49dc-9041-3095814f1167'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:12.660 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.776 187518 DEBUG nova.compute.manager [req-5638b76c-57f6-487c-ad96-247c72734768 req-e7a8c4f5-6ff7-4f9a-a997-2617a2701734 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received event network-vif-plugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.776 187518 DEBUG oslo_concurrency.lockutils [req-5638b76c-57f6-487c-ad96-247c72734768 req-e7a8c4f5-6ff7-4f9a-a997-2617a2701734 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.782 187518 DEBUG oslo_concurrency.lockutils [req-5638b76c-57f6-487c-ad96-247c72734768 req-e7a8c4f5-6ff7-4f9a-a997-2617a2701734 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.782 187518 DEBUG oslo_concurrency.lockutils [req-5638b76c-57f6-487c-ad96-247c72734768 req-e7a8c4f5-6ff7-4f9a-a997-2617a2701734 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:12 np0005539279 nova_compute[187514]: 2025-11-29 00:57:12.783 187518 DEBUG nova.compute.manager [req-5638b76c-57f6-487c-ad96-247c72734768 req-e7a8c4f5-6ff7-4f9a-a997-2617a2701734 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Processing event network-vif-plugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.083 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377833.0831861, 7e8ca829-ad9a-476a-afe3-92e4f655c723 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.084 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] VM Started (Lifecycle Event)#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.086 187518 DEBUG nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.091 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.098 187518 INFO nova.virt.libvirt.driver [-] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Instance spawned successfully.#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.099 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.121 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.131 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.139 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.140 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.141 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.142 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.143 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.143 187518 DEBUG nova.virt.libvirt.driver [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.157 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.158 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377833.083376, 7e8ca829-ad9a-476a-afe3-92e4f655c723 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.158 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] VM Paused (Lifecycle Event)#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.197 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.201 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377833.0903244, 7e8ca829-ad9a-476a-afe3-92e4f655c723 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.202 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] VM Resumed (Lifecycle Event)#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.212 187518 INFO nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Took 9.71 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.212 187518 DEBUG nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.223 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.227 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.258 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.286 187518 INFO nova.compute.manager [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Took 10.35 seconds to build instance.#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.321 187518 DEBUG oslo_concurrency.lockutils [None req-2cf5eefa-1ffc-45d1-8f01-1f3d94683458 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.358 187518 DEBUG nova.network.neutron [req-2bc3103a-ad30-41e5-b65c-f08f23d8f050 req-4b34ec2a-1182-41b5-a672-817bf34e3abe 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Updated VIF entry in instance network info cache for port f6ce3521-17c6-45ca-bf6e-55c091ec29c7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.359 187518 DEBUG nova.network.neutron [req-2bc3103a-ad30-41e5-b65c-f08f23d8f050 req-4b34ec2a-1182-41b5-a672-817bf34e3abe 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Updating instance_info_cache with network_info: [{"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:57:13 np0005539279 nova_compute[187514]: 2025-11-29 00:57:13.374 187518 DEBUG oslo_concurrency.lockutils [req-2bc3103a-ad30-41e5-b65c-f08f23d8f050 req-4b34ec2a-1182-41b5-a672-817bf34e3abe 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-7e8ca829-ad9a-476a-afe3-92e4f655c723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:57:14 np0005539279 nova_compute[187514]: 2025-11-29 00:57:14.899 187518 DEBUG nova.compute.manager [req-7358a49e-2221-402f-9bd0-e1ba41565b88 req-febe36b2-4acd-4fdf-998a-b4d562dd6894 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received event network-vif-plugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:57:14 np0005539279 nova_compute[187514]: 2025-11-29 00:57:14.900 187518 DEBUG oslo_concurrency.lockutils [req-7358a49e-2221-402f-9bd0-e1ba41565b88 req-febe36b2-4acd-4fdf-998a-b4d562dd6894 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:14 np0005539279 nova_compute[187514]: 2025-11-29 00:57:14.900 187518 DEBUG oslo_concurrency.lockutils [req-7358a49e-2221-402f-9bd0-e1ba41565b88 req-febe36b2-4acd-4fdf-998a-b4d562dd6894 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:14 np0005539279 nova_compute[187514]: 2025-11-29 00:57:14.900 187518 DEBUG oslo_concurrency.lockutils [req-7358a49e-2221-402f-9bd0-e1ba41565b88 req-febe36b2-4acd-4fdf-998a-b4d562dd6894 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:14 np0005539279 nova_compute[187514]: 2025-11-29 00:57:14.900 187518 DEBUG nova.compute.manager [req-7358a49e-2221-402f-9bd0-e1ba41565b88 req-febe36b2-4acd-4fdf-998a-b4d562dd6894 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] No waiting events found dispatching network-vif-plugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:57:14 np0005539279 nova_compute[187514]: 2025-11-29 00:57:14.900 187518 WARNING nova.compute.manager [req-7358a49e-2221-402f-9bd0-e1ba41565b88 req-febe36b2-4acd-4fdf-998a-b4d562dd6894 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received unexpected event network-vif-plugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 for instance with vm_state active and task_state None.#033[00m
Nov 28 19:57:16 np0005539279 nova_compute[187514]: 2025-11-29 00:57:16.838 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:17 np0005539279 nova_compute[187514]: 2025-11-29 00:57:17.368 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:17 np0005539279 nova_compute[187514]: 2025-11-29 00:57:17.651 187518 DEBUG nova.compute.manager [req-cc59c834-25b3-47c3-b32c-0afbd2a73ca1 req-ed14d6f8-9dd4-4af3-852d-817744f8e25e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received event network-changed-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:57:17 np0005539279 nova_compute[187514]: 2025-11-29 00:57:17.652 187518 DEBUG nova.compute.manager [req-cc59c834-25b3-47c3-b32c-0afbd2a73ca1 req-ed14d6f8-9dd4-4af3-852d-817744f8e25e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Refreshing instance network info cache due to event network-changed-f6ce3521-17c6-45ca-bf6e-55c091ec29c7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:57:17 np0005539279 nova_compute[187514]: 2025-11-29 00:57:17.653 187518 DEBUG oslo_concurrency.lockutils [req-cc59c834-25b3-47c3-b32c-0afbd2a73ca1 req-ed14d6f8-9dd4-4af3-852d-817744f8e25e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-7e8ca829-ad9a-476a-afe3-92e4f655c723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:57:17 np0005539279 nova_compute[187514]: 2025-11-29 00:57:17.653 187518 DEBUG oslo_concurrency.lockutils [req-cc59c834-25b3-47c3-b32c-0afbd2a73ca1 req-ed14d6f8-9dd4-4af3-852d-817744f8e25e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-7e8ca829-ad9a-476a-afe3-92e4f655c723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:57:17 np0005539279 nova_compute[187514]: 2025-11-29 00:57:17.653 187518 DEBUG nova.network.neutron [req-cc59c834-25b3-47c3-b32c-0afbd2a73ca1 req-ed14d6f8-9dd4-4af3-852d-817744f8e25e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Refreshing network info cache for port f6ce3521-17c6-45ca-bf6e-55c091ec29c7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:57:19 np0005539279 nova_compute[187514]: 2025-11-29 00:57:19.022 187518 DEBUG nova.network.neutron [req-cc59c834-25b3-47c3-b32c-0afbd2a73ca1 req-ed14d6f8-9dd4-4af3-852d-817744f8e25e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Updated VIF entry in instance network info cache for port f6ce3521-17c6-45ca-bf6e-55c091ec29c7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:57:19 np0005539279 nova_compute[187514]: 2025-11-29 00:57:19.023 187518 DEBUG nova.network.neutron [req-cc59c834-25b3-47c3-b32c-0afbd2a73ca1 req-ed14d6f8-9dd4-4af3-852d-817744f8e25e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Updating instance_info_cache with network_info: [{"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:57:19 np0005539279 nova_compute[187514]: 2025-11-29 00:57:19.052 187518 DEBUG oslo_concurrency.lockutils [req-cc59c834-25b3-47c3-b32c-0afbd2a73ca1 req-ed14d6f8-9dd4-4af3-852d-817744f8e25e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-7e8ca829-ad9a-476a-afe3-92e4f655c723" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:57:19 np0005539279 nova_compute[187514]: 2025-11-29 00:57:19.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:57:20 np0005539279 nova_compute[187514]: 2025-11-29 00:57:20.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:57:20 np0005539279 nova_compute[187514]: 2025-11-29 00:57:20.611 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:57:20 np0005539279 nova_compute[187514]: 2025-11-29 00:57:20.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:57:21 np0005539279 nova_compute[187514]: 2025-11-29 00:57:21.845 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.416 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.639 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.640 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.640 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.640 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.731 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.821 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.823 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.913 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:22 np0005539279 nova_compute[187514]: 2025-11-29 00:57:22.922 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.007 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.009 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.102 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.367 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.369 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5497MB free_disk=73.31324768066406GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.370 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.370 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.479 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Instance f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.480 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Instance 7e8ca829-ad9a-476a-afe3-92e4f655c723 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.481 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.481 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.561 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.578 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.613 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:57:23 np0005539279 nova_compute[187514]: 2025-11-29 00:57:23.614 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:23 np0005539279 podman[216172]: 2025-11-29 00:57:23.850210677 +0000 UTC m=+0.078272699 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 19:57:23 np0005539279 podman[216171]: 2025-11-29 00:57:23.859876219 +0000 UTC m=+0.090504301 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Nov 28 19:57:24 np0005539279 nova_compute[187514]: 2025-11-29 00:57:24.614 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:57:24 np0005539279 nova_compute[187514]: 2025-11-29 00:57:24.615 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:57:24 np0005539279 nova_compute[187514]: 2025-11-29 00:57:24.615 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 19:57:25 np0005539279 nova_compute[187514]: 2025-11-29 00:57:25.324 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:57:25 np0005539279 nova_compute[187514]: 2025-11-29 00:57:25.324 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquired lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:57:25 np0005539279 nova_compute[187514]: 2025-11-29 00:57:25.325 187518 DEBUG nova.network.neutron [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 19:57:25 np0005539279 nova_compute[187514]: 2025-11-29 00:57:25.326 187518 DEBUG nova.objects.instance [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:57:25 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:25Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:11:d1:7f 10.100.0.5
Nov 28 19:57:25 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:25Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:11:d1:7f 10.100.0.5
Nov 28 19:57:26 np0005539279 nova_compute[187514]: 2025-11-29 00:57:26.847 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:26 np0005539279 podman[216216]: 2025-11-29 00:57:26.85467619 +0000 UTC m=+0.077531058 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:57:26 np0005539279 nova_compute[187514]: 2025-11-29 00:57:26.884 187518 DEBUG nova.network.neutron [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updating instance_info_cache with network_info: [{"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:57:26 np0005539279 podman[216215]: 2025-11-29 00:57:26.896920248 +0000 UTC m=+0.123268511 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:57:26 np0005539279 nova_compute[187514]: 2025-11-29 00:57:26.901 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Releasing lock "refresh_cache-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:57:26 np0005539279 nova_compute[187514]: 2025-11-29 00:57:26.901 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 19:57:26 np0005539279 nova_compute[187514]: 2025-11-29 00:57:26.901 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:57:26 np0005539279 nova_compute[187514]: 2025-11-29 00:57:26.901 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:57:27 np0005539279 nova_compute[187514]: 2025-11-29 00:57:27.431 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:27 np0005539279 nova_compute[187514]: 2025-11-29 00:57:27.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:57:27 np0005539279 nova_compute[187514]: 2025-11-29 00:57:27.611 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:57:31 np0005539279 nova_compute[187514]: 2025-11-29 00:57:31.849 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.193 187518 INFO nova.compute.manager [None req-c3a3aa6a-65d9-4965-9cd2-b2ccd454e91b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Get console output#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.200 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.434 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.549 187518 DEBUG oslo_concurrency.lockutils [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "7e8ca829-ad9a-476a-afe3-92e4f655c723" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.550 187518 DEBUG oslo_concurrency.lockutils [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.550 187518 DEBUG oslo_concurrency.lockutils [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.551 187518 DEBUG oslo_concurrency.lockutils [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.551 187518 DEBUG oslo_concurrency.lockutils [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.553 187518 INFO nova.compute.manager [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Terminating instance#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.555 187518 DEBUG nova.compute.manager [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 19:57:32 np0005539279 kernel: tapf6ce3521-17 (unregistering): left promiscuous mode
Nov 28 19:57:32 np0005539279 NetworkManager[55703]: <info>  [1764377852.5785] device (tapf6ce3521-17): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 19:57:32 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:32Z|00085|binding|INFO|Releasing lport f6ce3521-17c6-45ca-bf6e-55c091ec29c7 from this chassis (sb_readonly=0)
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.623 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:32 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:32Z|00086|binding|INFO|Setting lport f6ce3521-17c6-45ca-bf6e-55c091ec29c7 down in Southbound
Nov 28 19:57:32 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:32Z|00087|binding|INFO|Removing iface tapf6ce3521-17 ovn-installed in OVS
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.631 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:d1:7f 10.100.0.5'], port_security=['fa:16:3e:11:d1:7f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '7e8ca829-ad9a-476a-afe3-92e4f655c723', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3464627b-636f-42dd-ae8e-b4b260cea225', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02a5c569-2c63-4158-9f46-b0626b15a7d2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.237'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aaff49b6-7ffb-4dbf-949e-3d42bb7e7357, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=f6ce3521-17c6-45ca-bf6e-55c091ec29c7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.633 104584 INFO neutron.agent.ovn.metadata.agent [-] Port f6ce3521-17c6-45ca-bf6e-55c091ec29c7 in datapath 3464627b-636f-42dd-ae8e-b4b260cea225 unbound from our chassis#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.634 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3464627b-636f-42dd-ae8e-b4b260cea225#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.656 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.656 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[fb54abff-9565-4ddc-8288-724b80998562]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:32 np0005539279 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 28 19:57:32 np0005539279 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 12.439s CPU time.
Nov 28 19:57:32 np0005539279 systemd-machined[153752]: Machine qemu-5-instance-00000005 terminated.
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.689 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[c89c950f-b600-4180-98ea-28c14f48d617]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.691 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[34733933-3aa1-44c8-aa73-c9ef56c5ed7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:32.721 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0b304e36150c84a54efc5595a083f7ed4670ef1c7fd825658e3fee6080e7923d" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.728 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[59efd968-ac10-4b6a-a522-ca55b197c714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.762 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e25f60d7-731a-404a-a8db-f0326172371d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3464627b-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:b4:66'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372770, 'reachable_time': 19228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216279, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.791 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e1cd663a-fa54-4149-b01c-33904113ea24]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap3464627b-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372789, 'tstamp': 372789}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216281, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap3464627b-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372793, 'tstamp': 372793}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216281, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.795 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3464627b-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.796 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.802 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.803 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3464627b-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.803 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.804 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3464627b-60, col_values=(('external_ids', {'iface-id': '2e2525c3-99d8-49dc-9041-3095814f1167'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:32.805 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:57:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:32.820 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 29 Nov 2025 00:57:32 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-acc1fb48-a782-4098-8cb9-0cb835ddd56f x-openstack-request-id: req-acc1fb48-a782-4098-8cb9-0cb835ddd56f _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 19:57:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:32.821 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "0dde7307-f374-4695-a505-375ef97c22c2", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/0dde7307-f374-4695-a505-375ef97c22c2"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/0dde7307-f374-4695-a505-375ef97c22c2"}]}, {"id": "6ce17e5f-9ac5-497d-adc9-1357453b4367", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/6ce17e5f-9ac5-497d-adc9-1357453b4367"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/6ce17e5f-9ac5-497d-adc9-1357453b4367"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 19:57:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:32.821 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-acc1fb48-a782-4098-8cb9-0cb835ddd56f request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 19:57:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:32.824 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/6ce17e5f-9ac5-497d-adc9-1357453b4367 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}0b304e36150c84a54efc5595a083f7ed4670ef1c7fd825658e3fee6080e7923d" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.841 187518 INFO nova.virt.libvirt.driver [-] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Instance destroyed successfully.#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.842 187518 DEBUG nova.objects.instance [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid 7e8ca829-ad9a-476a-afe3-92e4f655c723 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.865 187518 DEBUG nova.virt.libvirt.vif [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:57:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1750823135',display_name='tempest-TestNetworkBasicOps-server-1750823135',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1750823135',id=5,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCP43JqvRGAxWdvk3K3a0G6FyqPzv+FFHCpIJnBjZ0AHMx6QrOf6TDyM8pvxhfQgXbL5drg0ciJrBrarxp8AXjh0nfAH+ZBgtvaGK28fLVWywfb9yZbA262SAEx+aVtg7g==',key_name='tempest-TestNetworkBasicOps-1823002382',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:57:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-fpl15jni',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:57:13Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=7e8ca829-ad9a-476a-afe3-92e4f655c723,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.865 187518 DEBUG nova.network.os_vif_util [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "address": "fa:16:3e:11:d1:7f", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6ce3521-17", "ovs_interfaceid": "f6ce3521-17c6-45ca-bf6e-55c091ec29c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.866 187518 DEBUG nova.network.os_vif_util [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:d1:7f,bridge_name='br-int',has_traffic_filtering=True,id=f6ce3521-17c6-45ca-bf6e-55c091ec29c7,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6ce3521-17') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.867 187518 DEBUG os_vif [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:d1:7f,bridge_name='br-int',has_traffic_filtering=True,id=f6ce3521-17c6-45ca-bf6e-55c091ec29c7,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6ce3521-17') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.869 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.870 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6ce3521-17, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.872 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.874 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.875 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.879 187518 INFO os_vif [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:d1:7f,bridge_name='br-int',has_traffic_filtering=True,id=f6ce3521-17c6-45ca-bf6e-55c091ec29c7,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6ce3521-17')#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.880 187518 INFO nova.virt.libvirt.driver [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Deleting instance files /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723_del#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.881 187518 INFO nova.virt.libvirt.driver [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Deletion of /var/lib/nova/instances/7e8ca829-ad9a-476a-afe3-92e4f655c723_del complete#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.904 187518 DEBUG nova.compute.manager [req-38931273-d044-4485-b938-d35e4da6ef05 req-e8932f6d-cb27-4c95-bb29-55a96e5f42af 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received event network-vif-unplugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.904 187518 DEBUG oslo_concurrency.lockutils [req-38931273-d044-4485-b938-d35e4da6ef05 req-e8932f6d-cb27-4c95-bb29-55a96e5f42af 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.905 187518 DEBUG oslo_concurrency.lockutils [req-38931273-d044-4485-b938-d35e4da6ef05 req-e8932f6d-cb27-4c95-bb29-55a96e5f42af 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.905 187518 DEBUG oslo_concurrency.lockutils [req-38931273-d044-4485-b938-d35e4da6ef05 req-e8932f6d-cb27-4c95-bb29-55a96e5f42af 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.905 187518 DEBUG nova.compute.manager [req-38931273-d044-4485-b938-d35e4da6ef05 req-e8932f6d-cb27-4c95-bb29-55a96e5f42af 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] No waiting events found dispatching network-vif-unplugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.906 187518 DEBUG nova.compute.manager [req-38931273-d044-4485-b938-d35e4da6ef05 req-e8932f6d-cb27-4c95-bb29-55a96e5f42af 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received event network-vif-unplugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.940 187518 INFO nova.compute.manager [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.940 187518 DEBUG oslo.service.loopingcall [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.941 187518 DEBUG nova.compute.manager [-] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 19:57:32 np0005539279 nova_compute[187514]: 2025-11-29 00:57:32.941 187518 DEBUG nova.network.neutron [-] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.002 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Sat, 29 Nov 2025 00:57:32 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-353bf1db-6f79-413b-ba25-7c7973d926ad x-openstack-request-id: req-353bf1db-6f79-413b-ba25-7c7973d926ad _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.002 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "6ce17e5f-9ac5-497d-adc9-1357453b4367", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/6ce17e5f-9ac5-497d-adc9-1357453b4367"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/6ce17e5f-9ac5-497d-adc9-1357453b4367"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.002 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/6ce17e5f-9ac5-497d-adc9-1357453b4367 used request id req-353bf1db-6f79-413b-ba25-7c7973d926ad request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.004 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'name': 'tempest-TestNetworkBasicOps-server-302931196', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0df0de37c7d74836a2135b0d6ff3a067', 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'hostId': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e8ca829-ad9a-476a-afe3-92e4f655c723' (instance-00000005)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '7e8ca829-ad9a-476a-afe3-92e4f655c723' (instance-00000005)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.005 12 ERROR ceilometer.compute.virt.libvirt.utils [-] Fail to get domain uuid 7e8ca829-ad9a-476a-afe3-92e4f655c723 metadata, libvirtError: Domain not found: no domain with matching uuid '7e8ca829-ad9a-476a-afe3-92e4f655c723' (instance-00000005)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.006 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.051 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.read.bytes volume: 30591488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.052 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77c5fa38-49db-49f3-b229-b21697fe3c6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30591488, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-vda', 'timestamp': '2025-11-29T00:57:33.006643', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '634fed32-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': '64809fa4412456db3693b6a2c7678adce38055d8b2fc0eb3502b4d4546d764f2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 
'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-sda', 'timestamp': '2025-11-29T00:57:33.006643', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '63500de4-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': '813d9c988b5776da83528c355aa3731aee7a6476c57b299b21ef12cc44f47524'}]}, 'timestamp': '2025-11-29 00:57:33.053160', '_unique_id': 'd6747855c0c643feb97fd6f2ed6d0324'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.064 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.070 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.076 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 / tapa8cbd84f-18 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.076 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '921e445d-23e6-47bb-99f2-6dbc07821149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.071981', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '6353bbb0-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': 'cfd7fc45e79654b3e65d6bdde1a151b4b9d4b12a0766aea6d426aac91403c877'}]}, 'timestamp': '2025-11-29 00:57:33.077246', '_unique_id': '13bc102a1f43470f93135eaa9c0c5aeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.078 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.079 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.079 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffc61638-a6d0-4b6e-a4f3-18e35f82da73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.079551', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '635427bc-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': '5bf7443d691883c182ea979ae41ea52b6f12d4d2f607c8d82bfbcec59404bb92'}]}, 'timestamp': '2025-11-29 00:57:33.079940', '_unique_id': 'e82387931fb04e748073031cb7df02dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.080 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.081 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.081 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.081 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-302931196>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-302931196>]
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.082 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.100 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.101 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edad8ce6-0c5b-4e23-8979-6a44cf3c4a14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-vda', 'timestamp': '2025-11-29T00:57:33.082573', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6357589c-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.841373286, 'message_signature': 'ddcbf264c350acde8baf80e4a3855e4d5f58ef62543a9117853041c3b4dfa5f6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 
'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-sda', 'timestamp': '2025-11-29T00:57:33.082573', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '635772aa-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.841373286, 'message_signature': 'de39e1f564990f5d0a46bf473ceebe67a78534c93a7731dd0c919b22c4e858b4'}]}, 'timestamp': '2025-11-29 00:57:33.101662', '_unique_id': 'e49f2008f45f490ea69e084504c23e19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.103 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.105 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.105 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7429a285-8bf7-4d65-a710-fc37d3ed2917', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-vda', 'timestamp': '2025-11-29T00:57:33.105022', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '63580c9c-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.841373286, 'message_signature': '8d11e08f926d235a3138fe3f1109aec2506c2c17cc1dc62a4a00cf90560b63f7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 
'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-sda', 'timestamp': '2025-11-29T00:57:33.105022', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6358201a-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.841373286, 'message_signature': 'ae1d4b22e8ff4bddc766870e313afe629b7eccad70e6bef40488ab99824aa1f3'}]}, 'timestamp': '2025-11-29 00:57:33.105987', '_unique_id': 'ef0793fd51a04b448059f72bac522aee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.107 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.108 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.108 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.incoming.bytes volume: 19432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee23b0d5-8bcc-42f0-b31f-ae54cc6aa062', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19432, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.108323', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '63588f6e-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': '064a1da5ac055ec7722fca6e8a54fe18709b5a1e6c846808fea94e7149879a20'}]}, 'timestamp': '2025-11-29 00:57:33.108892', '_unique_id': '1b4d835032f4418f863258ce5d3f721f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.110 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.111 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.111 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.write.bytes volume: 73105408 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.111 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f080b3c7-d6fc-44be-aaf8-99772c56e3fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73105408, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-vda', 'timestamp': '2025-11-29T00:57:33.111380', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '63590674-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': 'b3510a91b87827200444a1e24f2f63e276c7b5edd87c6881fc7ad2b616d59da7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-sda', 'timestamp': '2025-11-29T00:57:33.111380', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '635917f4-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': 'bc7eb246a16d7e8a6d30cf31eecf227985d302acb80018b3407cd6dcbd34b5be'}]}, 'timestamp': '2025-11-29 00:57:33.112328', '_unique_id': 'e22bdd7964784b6689fe40e80f36e99d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.113 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.114 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.114 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.114 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-302931196>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-302931196>]
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.115 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.115 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab369a4c-7797-4340-8c5a-b1e8d13468f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.115334', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '6359a106-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': 'd7fd3866948824627ad6948386095a66ee5017333b81508c5ce2600bec902ab9'}]}, 'timestamp': '2025-11-29 00:57:33.115869', '_unique_id': '316e89c87dfa44bf9b9d997488d8d5d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.116 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.118 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6ff6e7d-7056-42d2-9538-9dfccbeed8ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.118702', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '635a25a4-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': '3c5833da58cd3ecfe69fd6ed0f2eddbd16ef34fc468b6eb98c44138de9e21415'}]}, 'timestamp': '2025-11-29 00:57:33.119326', '_unique_id': '13fc765504ca4faa90903e45ae1404b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.120 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.121 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.121 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.read.requests volume: 1109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.122 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41fa22bc-5420-41f5-9e3f-175caaae4a91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1109, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-vda', 'timestamp': '2025-11-29T00:57:33.121731', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '635a98d6-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': '8299d8b412255fc9ba061cc1991cddc3674edc1c263521b61e78a4ff866d5ab8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-sda', 'timestamp': '2025-11-29T00:57:33.121731', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '635aa998-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': 'ce5a418d9bc45da0313ac8827cd30bbd43e0dc34b8b78575289333d834b5ef24'}]}, 'timestamp': '2025-11-29 00:57:33.122640', '_unique_id': '02cdbe31c1d84177a2f7a1303bb8ac3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.123 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.125 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.125 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ba5c4a1-ae5d-4dd3-b29c-373b3e439140', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-vda', 'timestamp': '2025-11-29T00:57:33.125095', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '635b1c7a-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.841373286, 'message_signature': '6dc305d4be4cb2c18fe2deecacdff50c2dd3145067b5b5bb38e17ecbafd6f4c2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-sda', 'timestamp': '2025-11-29T00:57:33.125095', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '635b2954-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.841373286, 'message_signature': '6c9cd334e73580796ee4c49578e11bfe41fbf6631234ed5208d71480230c8946'}]}, 'timestamp': '2025-11-29 00:57:33.125807', '_unique_id': '349bd16144f14df0ad6f40fc47710d5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.126 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.127 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.outgoing.packets volume: 111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0150da93-97ea-4896-8710-8bb97f474635', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 111, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.127626', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '635b7cf6-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': 'c809498e645f54fc17d1c829c88b2267175b62c62403c3594bc5ca24e04a3f5c'}]}, 'timestamp': '2025-11-29 00:57:33.128056', '_unique_id': 'c6f4807646a145a0b2cf98527774e1d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.128 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.130 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.write.latency volume: 2173912316 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.130 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db8df326-792e-4950-839d-e4019beff0d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2173912316, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-vda', 'timestamp': '2025-11-29T00:57:33.130036', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '635bdb88-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': '64153c0047e3a91b18e492e4a0aca7cc93d901c7cdb62566e8a76561db434e3b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-sda', 'timestamp': '2025-11-29T00:57:33.130036', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '635becd6-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': '3a7f17917e750fb334b6015c5956311465800134cb36ebb0c4c44212e0f1b2d1'}]}, 'timestamp': '2025-11-29 00:57:33.130858', '_unique_id': '4cdad74d13ef4d83b767efe7d69f70a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.131 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.132 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.155 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/memory.usage volume: 42.67578125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '036e8c8b-b463-4287-b4b8-48de12287f08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.67578125, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'timestamp': '2025-11-29T00:57:33.132610', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '635fccc0-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.913717293, 'message_signature': 'a6d4e5faad0785ceb62ec5670c9e7787ba443b50106ba0a4334891e9afb8856e'}]}, 'timestamp': '2025-11-29 00:57:33.156388', '_unique_id': 'bea75114c6694502a113e549857ae98b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.157 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.159 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.incoming.packets volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99a7508b-26d7-400d-8161-7ffc8b44ae90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 110, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.159581', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '636060e0-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': '11a161dd4dbe9d68ee5f7447f39c863151dda4326df79976916ac03a05b796aa'}]}, 'timestamp': '2025-11-29 00:57:33.160108', '_unique_id': 'ec3424ddc93e453f9450dd53fb8685ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.161 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.162 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.outgoing.bytes volume: 16150 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c646c22-156e-4a3a-a205-6c0c13e0e887', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16150, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.162378', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '6360d12e-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': '5f87b920c3820b9998e55d1f2c935ec65060f01ff9fec59b635840ef2f00095e'}]}, 'timestamp': '2025-11-29 00:57:33.162965', '_unique_id': '9dba3a0abbd94c4c9b53362aea59bf4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.163 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.165 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.165 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-302931196>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-302931196>]
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.165 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.166 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8759ba96-fbd2-4469-b3ce-f0ad4df0a988', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-vda', 'timestamp': '2025-11-29T00:57:33.165663', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '63614d2a-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': '3f41b0647bcfc52dcac750d30e0041e6b34f4522603b2c4693bc9547a17c4d36'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-sda', 'timestamp': '2025-11-29T00:57:33.165663', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '63615f4a-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': '420a0656ed371f1c71bfa4881f3d4f67c419592663e3f43705cf56a96964af3d'}]}, 'timestamp': '2025-11-29 00:57:33.166624', '_unique_id': '2116120708b7405eb4df6ebc1aecd232'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.167 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.168 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.168 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-302931196>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-302931196>]
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.168 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.read.latency volume: 282912367 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.169 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/disk.device.read.latency volume: 26661517 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd73223eb-aee5-47eb-9be4-fd73dbbdf9cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 282912367, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-vda', 'timestamp': '2025-11-29T00:57:33.168817', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6361c778-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': 'e1a61d02c6a63e78a74d8bf41cfde2750dd03eea4437b70e7f4df7818c168f2d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26661517, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-sda', 'timestamp': '2025-11-29T00:57:33.168817', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6361d588-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.765478003, 'message_signature': 'b36adc71086149a18edf95fb8a267ba9d0441db73fa28f2e37924e3a8ceda2f2'}]}, 'timestamp': '2025-11-29 00:57:33.169579', '_unique_id': '9eb5c66aec7f442db40dcd9f5211fda5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.170 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.171 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.171 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/cpu volume: 11030000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38f7d21d-3b44-4947-911a-601e5dc6447a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11030000000, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'timestamp': '2025-11-29T00:57:33.171264', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'instance-00000004', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '63622506-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.913717293, 'message_signature': '2dae9b30a9e829b10b6feb30f7431b9dc4197334647ac44b2a9a15960c706ed4'}]}, 'timestamp': '2025-11-29 00:57:33.171611', '_unique_id': '01d7bd62792c4e30ab3695390a43411a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.172 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.173 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1ec7eda-466c-4dbd-9177-847b5cb1911c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.173430', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '63627b50-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': 'cdbc50e8a8b0892557646c97e060cd594da63f94cb2e73eed17b46f033ce15ba'}]}, 'timestamp': '2025-11-29 00:57:33.173810', '_unique_id': '22a90fbc09c14ca994ac4620b78c2f26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.174 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.175 12 DEBUG ceilometer.compute.pollsters [-] f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '915e6591-3139-48a1-8ad4-4a1a9b7a38d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000004-f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-tapa8cbd84f-18', 'timestamp': '2025-11-29T00:57:33.175465', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-302931196', 'name': 'tapa8cbd84f-18', 'instance_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:56:7e:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa8cbd84f-18'}, 'message_id': '6362ca7e-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3786.83084854, 'message_signature': '98bf98697cb926dfc012b8d35485510112611cf1c594dd1d14492f31eb9611c5'}]}, 'timestamp': '2025-11-29 00:57:33.175830', '_unique_id': '2ec49c253e01481aa877c278d2b89e6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:57:33 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:57:33.176 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:57:33 np0005539279 nova_compute[187514]: 2025-11-29 00:57:33.887 187518 DEBUG nova.network.neutron [-] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 19:57:33 np0005539279 nova_compute[187514]: 2025-11-29 00:57:33.910 187518 INFO nova.compute.manager [-] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Took 0.97 seconds to deallocate network for instance.
Nov 28 19:57:33 np0005539279 nova_compute[187514]: 2025-11-29 00:57:33.974 187518 DEBUG nova.compute.manager [req-04063494-eea8-47d0-a729-874cf2499d10 req-524f7a5f-67e1-4dbc-821e-6eb452077f08 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received event network-vif-deleted-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 19:57:33 np0005539279 nova_compute[187514]: 2025-11-29 00:57:33.977 187518 DEBUG oslo_concurrency.lockutils [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:57:33 np0005539279 nova_compute[187514]: 2025-11-29 00:57:33.978 187518 DEBUG oslo_concurrency.lockutils [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:57:34 np0005539279 nova_compute[187514]: 2025-11-29 00:57:34.057 187518 DEBUG nova.compute.provider_tree [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 19:57:34 np0005539279 nova_compute[187514]: 2025-11-29 00:57:34.077 187518 DEBUG nova.scheduler.client.report [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 19:57:34 np0005539279 nova_compute[187514]: 2025-11-29 00:57:34.110 187518 DEBUG oslo_concurrency.lockutils [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:57:34 np0005539279 nova_compute[187514]: 2025-11-29 00:57:34.135 187518 INFO nova.scheduler.client.report [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance 7e8ca829-ad9a-476a-afe3-92e4f655c723
Nov 28 19:57:34 np0005539279 nova_compute[187514]: 2025-11-29 00:57:34.204 187518 DEBUG oslo_concurrency.lockutils [None req-23bc1784-7c68-4637-9208-43bbac01b29e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:57:35 np0005539279 nova_compute[187514]: 2025-11-29 00:57:35.024 187518 DEBUG nova.compute.manager [req-24e631db-78c0-4fc3-aa48-c420ddd4a8d1 req-7e238006-dd5d-4bad-a679-433091165d99 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received event network-vif-plugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 19:57:35 np0005539279 nova_compute[187514]: 2025-11-29 00:57:35.024 187518 DEBUG oslo_concurrency.lockutils [req-24e631db-78c0-4fc3-aa48-c420ddd4a8d1 req-7e238006-dd5d-4bad-a679-433091165d99 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 19:57:35 np0005539279 nova_compute[187514]: 2025-11-29 00:57:35.025 187518 DEBUG oslo_concurrency.lockutils [req-24e631db-78c0-4fc3-aa48-c420ddd4a8d1 req-7e238006-dd5d-4bad-a679-433091165d99 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 19:57:35 np0005539279 nova_compute[187514]: 2025-11-29 00:57:35.025 187518 DEBUG oslo_concurrency.lockutils [req-24e631db-78c0-4fc3-aa48-c420ddd4a8d1 req-7e238006-dd5d-4bad-a679-433091165d99 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7e8ca829-ad9a-476a-afe3-92e4f655c723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 19:57:35 np0005539279 nova_compute[187514]: 2025-11-29 00:57:35.026 187518 DEBUG nova.compute.manager [req-24e631db-78c0-4fc3-aa48-c420ddd4a8d1 req-7e238006-dd5d-4bad-a679-433091165d99 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] No waiting events found dispatching network-vif-plugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 19:57:35 np0005539279 nova_compute[187514]: 2025-11-29 00:57:35.026 187518 WARNING nova.compute.manager [req-24e631db-78c0-4fc3-aa48-c420ddd4a8d1 req-7e238006-dd5d-4bad-a679-433091165d99 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Received unexpected event network-vif-plugged-f6ce3521-17c6-45ca-bf6e-55c091ec29c7 for instance with vm_state deleted and task_state None.
Nov 28 19:57:37 np0005539279 nova_compute[187514]: 2025-11-29 00:57:37.437 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 19:57:37 np0005539279 nova_compute[187514]: 2025-11-29 00:57:37.873 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 19:57:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:38.551 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.551 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:38.553 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.835 187518 DEBUG oslo_concurrency.lockutils [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.836 187518 DEBUG oslo_concurrency.lockutils [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.837 187518 DEBUG oslo_concurrency.lockutils [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.838 187518 DEBUG oslo_concurrency.lockutils [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.839 187518 DEBUG oslo_concurrency.lockutils [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.841 187518 INFO nova.compute.manager [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Terminating instance#033[00m
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.844 187518 DEBUG nova.compute.manager [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 19:57:38 np0005539279 kernel: tapa8cbd84f-18 (unregistering): left promiscuous mode
Nov 28 19:57:38 np0005539279 NetworkManager[55703]: <info>  [1764377858.8746] device (tapa8cbd84f-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 19:57:38 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:38Z|00088|binding|INFO|Releasing lport a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 from this chassis (sb_readonly=0)
Nov 28 19:57:38 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:38Z|00089|binding|INFO|Setting lport a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 down in Southbound
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.919 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:38 np0005539279 ovn_controller[95686]: 2025-11-29T00:57:38Z|00090|binding|INFO|Removing iface tapa8cbd84f-18 ovn-installed in OVS
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.929 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:38.932 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:56:7e:a1 10.100.0.3'], port_security=['fa:16:3e:56:7e:a1 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3464627b-636f-42dd-ae8e-b4b260cea225', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2f94587-1ade-4577-8fa4-d6db6a73fd0b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=aaff49b6-7ffb-4dbf-949e-3d42bb7e7357, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=a8cbd84f-18a7-4baf-9ce9-0617d15f9c10) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:57:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:38.934 104584 INFO neutron.agent.ovn.metadata.agent [-] Port a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 in datapath 3464627b-636f-42dd-ae8e-b4b260cea225 unbound from our chassis#033[00m
Nov 28 19:57:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:38.936 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3464627b-636f-42dd-ae8e-b4b260cea225, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 19:57:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:38.936 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a7112b-a719-4c0b-8867-c691dddb7589]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:38.938 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225 namespace which is not needed anymore#033[00m
Nov 28 19:57:38 np0005539279 nova_compute[187514]: 2025-11-29 00:57:38.948 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:38 np0005539279 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Nov 28 19:57:38 np0005539279 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 14.916s CPU time.
Nov 28 19:57:38 np0005539279 systemd-machined[153752]: Machine qemu-4-instance-00000004 terminated.
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.134 187518 INFO nova.virt.libvirt.driver [-] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Instance destroyed successfully.#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.137 187518 DEBUG nova.objects.instance [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:57:39 np0005539279 neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225[215821]: [NOTICE]   (215827) : haproxy version is 2.8.14-c23fe91
Nov 28 19:57:39 np0005539279 neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225[215821]: [NOTICE]   (215827) : path to executable is /usr/sbin/haproxy
Nov 28 19:57:39 np0005539279 neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225[215821]: [WARNING]  (215827) : Exiting Master process...
Nov 28 19:57:39 np0005539279 neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225[215821]: [ALERT]    (215827) : Current worker (215829) exited with code 143 (Terminated)
Nov 28 19:57:39 np0005539279 neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225[215821]: [WARNING]  (215827) : All workers exited. Exiting... (0)
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.162 187518 DEBUG nova.virt.libvirt.vif [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:56:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-302931196',display_name='tempest-TestNetworkBasicOps-server-302931196',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-302931196',id=4,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEGVEwm853y2eS199MxO6IEnfj3smPW9ngtv23k+04V7lQDOEde/4DwChpU/dPMjVKi0udHxpzA16alvdHRWsfEQmqVLBInT31956bVheL4YLYKxXq/G18LlYLmDWEYX7Q==',key_name='tempest-TestNetworkBasicOps-734851356',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:56:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-70t6y47o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:56:34Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.163 187518 DEBUG nova.network.os_vif_util [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "address": "fa:16:3e:56:7e:a1", "network": {"id": "3464627b-636f-42dd-ae8e-b4b260cea225", "bridge": "br-int", "label": "tempest-network-smoke--1682105530", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cbd84f-18", "ovs_interfaceid": "a8cbd84f-18a7-4baf-9ce9-0617d15f9c10", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:57:39 np0005539279 systemd[1]: libpod-aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f.scope: Deactivated successfully.
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.164 187518 DEBUG nova.network.os_vif_util [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:56:7e:a1,bridge_name='br-int',has_traffic_filtering=True,id=a8cbd84f-18a7-4baf-9ce9-0617d15f9c10,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cbd84f-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.165 187518 DEBUG os_vif [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:7e:a1,bridge_name='br-int',has_traffic_filtering=True,id=a8cbd84f-18a7-4baf-9ce9-0617d15f9c10,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cbd84f-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.167 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.167 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8cbd84f-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.170 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:39 np0005539279 podman[216330]: 2025-11-29 00:57:39.169593869 +0000 UTC m=+0.070396164 container died aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.174 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.178 187518 INFO os_vif [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:56:7e:a1,bridge_name='br-int',has_traffic_filtering=True,id=a8cbd84f-18a7-4baf-9ce9-0617d15f9c10,network=Network(3464627b-636f-42dd-ae8e-b4b260cea225),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cbd84f-18')#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.179 187518 INFO nova.virt.libvirt.driver [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Deleting instance files /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418_del#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.180 187518 INFO nova.virt.libvirt.driver [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Deletion of /var/lib/nova/instances/f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418_del complete#033[00m
Nov 28 19:57:39 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f-userdata-shm.mount: Deactivated successfully.
Nov 28 19:57:39 np0005539279 systemd[1]: var-lib-containers-storage-overlay-833d507e5fada38f13b7042c49a427b0a536e27362584a7072f886b0fa7b58df-merged.mount: Deactivated successfully.
Nov 28 19:57:39 np0005539279 podman[216330]: 2025-11-29 00:57:39.227846873 +0000 UTC m=+0.128649138 container cleanup aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 19:57:39 np0005539279 systemd[1]: libpod-conmon-aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f.scope: Deactivated successfully.
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.257 187518 INFO nova.compute.manager [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.258 187518 DEBUG oslo.service.loopingcall [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.259 187518 DEBUG nova.compute.manager [-] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.259 187518 DEBUG nova.network.neutron [-] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 19:57:39 np0005539279 podman[216372]: 2025-11-29 00:57:39.33445548 +0000 UTC m=+0.066540559 container remove aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 19:57:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:39.342 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[3bfb646d-7521-4788-92ed-5e5000c72d2f]: (4, ('Sat Nov 29 12:57:39 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225 (aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f)\naab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f\nSat Nov 29 12:57:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225 (aab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f)\naab4399313b92513f2f19ab4de27270196eb85eeaeb5d4ffee63fde555c0f08f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:39.344 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c56b77b9-f32a-490b-9ae8-fc3e1fe1aefa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:39.345 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3464627b-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.347 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:39 np0005539279 kernel: tap3464627b-60: left promiscuous mode
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.374 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:39.378 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0fc8866f-eaa1-4659-8075-ec896a096e5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:39.398 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[1021a87f-5a0c-45a7-9a4e-815ff56f5b46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:39.400 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7d6edb-82c1-47ad-b4e0-4a9904a73579]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:39.424 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[ab37699c-e823-45e8-a5f4-6b45291bdae0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372759, 'reachable_time': 28641, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216387, 'error': None, 'target': 'ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:39.428 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3464627b-636f-42dd-ae8e-b4b260cea225 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 19:57:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:39.428 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[7029a9f9-37f3-45fc-ae42-1652735d8fef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:57:39 np0005539279 systemd[1]: run-netns-ovnmeta\x2d3464627b\x2d636f\x2d42dd\x2dae8e\x2db4b260cea225.mount: Deactivated successfully.
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.520 187518 DEBUG nova.compute.manager [req-4b09d3e0-923d-4b28-aea3-70cd57b046a1 req-55d9cc72-877d-4549-80ec-d86b1b0b93b7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received event network-vif-unplugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.521 187518 DEBUG oslo_concurrency.lockutils [req-4b09d3e0-923d-4b28-aea3-70cd57b046a1 req-55d9cc72-877d-4549-80ec-d86b1b0b93b7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.522 187518 DEBUG oslo_concurrency.lockutils [req-4b09d3e0-923d-4b28-aea3-70cd57b046a1 req-55d9cc72-877d-4549-80ec-d86b1b0b93b7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.523 187518 DEBUG oslo_concurrency.lockutils [req-4b09d3e0-923d-4b28-aea3-70cd57b046a1 req-55d9cc72-877d-4549-80ec-d86b1b0b93b7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.523 187518 DEBUG nova.compute.manager [req-4b09d3e0-923d-4b28-aea3-70cd57b046a1 req-55d9cc72-877d-4549-80ec-d86b1b0b93b7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] No waiting events found dispatching network-vif-unplugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.524 187518 DEBUG nova.compute.manager [req-4b09d3e0-923d-4b28-aea3-70cd57b046a1 req-55d9cc72-877d-4549-80ec-d86b1b0b93b7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received event network-vif-unplugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.830 187518 DEBUG nova.network.neutron [-] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.853 187518 INFO nova.compute.manager [-] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Took 0.59 seconds to deallocate network for instance.#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.883 187518 DEBUG nova.compute.manager [req-d0477c03-7d66-4f21-996c-6d2837110f00 req-bdcd1b93-4695-4a5e-8882-0e6a61dd07a5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received event network-vif-deleted-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.905 187518 DEBUG oslo_concurrency.lockutils [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.906 187518 DEBUG oslo_concurrency.lockutils [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.971 187518 DEBUG nova.compute.provider_tree [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:57:39 np0005539279 nova_compute[187514]: 2025-11-29 00:57:39.986 187518 DEBUG nova.scheduler.client.report [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:57:40 np0005539279 nova_compute[187514]: 2025-11-29 00:57:40.007 187518 DEBUG oslo_concurrency.lockutils [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:40 np0005539279 nova_compute[187514]: 2025-11-29 00:57:40.029 187518 INFO nova.scheduler.client.report [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418#033[00m
Nov 28 19:57:40 np0005539279 nova_compute[187514]: 2025-11-29 00:57:40.096 187518 DEBUG oslo_concurrency.lockutils [None req-99244fb6-ba59-4012-9a89-f211ebb37025 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:41 np0005539279 nova_compute[187514]: 2025-11-29 00:57:41.651 187518 DEBUG nova.compute.manager [req-32d80385-3193-4dcb-8f74-279ab60fb443 req-8d957260-40a4-4fcb-9888-2a5ed70e41df 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received event network-vif-plugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:57:41 np0005539279 nova_compute[187514]: 2025-11-29 00:57:41.652 187518 DEBUG oslo_concurrency.lockutils [req-32d80385-3193-4dcb-8f74-279ab60fb443 req-8d957260-40a4-4fcb-9888-2a5ed70e41df 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:41 np0005539279 nova_compute[187514]: 2025-11-29 00:57:41.652 187518 DEBUG oslo_concurrency.lockutils [req-32d80385-3193-4dcb-8f74-279ab60fb443 req-8d957260-40a4-4fcb-9888-2a5ed70e41df 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:41 np0005539279 nova_compute[187514]: 2025-11-29 00:57:41.653 187518 DEBUG oslo_concurrency.lockutils [req-32d80385-3193-4dcb-8f74-279ab60fb443 req-8d957260-40a4-4fcb-9888-2a5ed70e41df 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:41 np0005539279 nova_compute[187514]: 2025-11-29 00:57:41.653 187518 DEBUG nova.compute.manager [req-32d80385-3193-4dcb-8f74-279ab60fb443 req-8d957260-40a4-4fcb-9888-2a5ed70e41df 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] No waiting events found dispatching network-vif-plugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:57:41 np0005539279 nova_compute[187514]: 2025-11-29 00:57:41.654 187518 WARNING nova.compute.manager [req-32d80385-3193-4dcb-8f74-279ab60fb443 req-8d957260-40a4-4fcb-9888-2a5ed70e41df 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Received unexpected event network-vif-plugged-a8cbd84f-18a7-4baf-9ce9-0617d15f9c10 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 19:57:41 np0005539279 podman[216389]: 2025-11-29 00:57:41.837571047 +0000 UTC m=+0.077240361 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 19:57:41 np0005539279 podman[216388]: 2025-11-29 00:57:41.841081592 +0000 UTC m=+0.075956175 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 19:57:42 np0005539279 nova_compute[187514]: 2025-11-29 00:57:42.440 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:42 np0005539279 podman[216433]: 2025-11-29 00:57:42.834808223 +0000 UTC m=+0.075276447 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 19:57:44 np0005539279 nova_compute[187514]: 2025-11-29 00:57:44.178 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:44 np0005539279 nova_compute[187514]: 2025-11-29 00:57:44.777 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:44 np0005539279 nova_compute[187514]: 2025-11-29 00:57:44.881 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:46 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:57:46.556 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:57:47 np0005539279 nova_compute[187514]: 2025-11-29 00:57:47.442 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:47 np0005539279 nova_compute[187514]: 2025-11-29 00:57:47.840 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764377852.8388708, 7e8ca829-ad9a-476a-afe3-92e4f655c723 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:57:47 np0005539279 nova_compute[187514]: 2025-11-29 00:57:47.840 187518 INFO nova.compute.manager [-] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] VM Stopped (Lifecycle Event)#033[00m
Nov 28 19:57:47 np0005539279 nova_compute[187514]: 2025-11-29 00:57:47.867 187518 DEBUG nova.compute.manager [None req-534173f3-0529-4cc7-9ed5-3191c7245e92 - - - - - -] [instance: 7e8ca829-ad9a-476a-afe3-92e4f655c723] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:57:49 np0005539279 nova_compute[187514]: 2025-11-29 00:57:49.188 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:52 np0005539279 nova_compute[187514]: 2025-11-29 00:57:52.444 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:54 np0005539279 nova_compute[187514]: 2025-11-29 00:57:54.133 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764377859.1310797, f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:57:54 np0005539279 nova_compute[187514]: 2025-11-29 00:57:54.134 187518 INFO nova.compute.manager [-] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] VM Stopped (Lifecycle Event)#033[00m
Nov 28 19:57:54 np0005539279 nova_compute[187514]: 2025-11-29 00:57:54.172 187518 DEBUG nova.compute.manager [None req-94e4c1f7-4526-4bdc-a861-418498ef9e80 - - - - - -] [instance: f4a54d8a-2f31-42b1-b7a9-b2b6d75d4418] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:57:54 np0005539279 nova_compute[187514]: 2025-11-29 00:57:54.190 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:54 np0005539279 podman[216461]: 2025-11-29 00:57:54.851109134 +0000 UTC m=+0.088120056 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:57:54 np0005539279 podman[216460]: 2025-11-29 00:57:54.862449883 +0000 UTC m=+0.103059793 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.440 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.441 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.446 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.465 187518 DEBUG nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.575 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.576 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.586 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.586 187518 INFO nova.compute.claims [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.743 187518 DEBUG nova.compute.provider_tree [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.776 187518 DEBUG nova.scheduler.client.report [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.817 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.818 187518 DEBUG nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 19:57:57 np0005539279 podman[216508]: 2025-11-29 00:57:57.861052676 +0000 UTC m=+0.090474540 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.917 187518 DEBUG nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.917 187518 DEBUG nova.network.neutron [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 19:57:57 np0005539279 podman[216507]: 2025-11-29 00:57:57.925744905 +0000 UTC m=+0.152530247 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.953 187518 INFO nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 19:57:57 np0005539279 nova_compute[187514]: 2025-11-29 00:57:57.979 187518 DEBUG nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.072 187518 DEBUG nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.074 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.074 187518 INFO nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Creating image(s)#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.075 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.075 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.076 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.093 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.177 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.178 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.179 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.195 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.269 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.270 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.305 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.307 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.307 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.374 187518 DEBUG nova.policy [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.382 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.383 187518 DEBUG nova.virt.disk.api [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.383 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.438 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.439 187518 DEBUG nova.virt.disk.api [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.440 187518 DEBUG nova.objects.instance [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid cdaab479-3862-458b-b200-b443c1647c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.459 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.460 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Ensure instance console log exists: /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.460 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.461 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:57:58 np0005539279 nova_compute[187514]: 2025-11-29 00:57:58.462 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:57:59 np0005539279 nova_compute[187514]: 2025-11-29 00:57:59.191 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:57:59 np0005539279 nova_compute[187514]: 2025-11-29 00:57:59.697 187518 DEBUG nova.network.neutron [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Successfully created port: 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 19:58:01 np0005539279 nova_compute[187514]: 2025-11-29 00:58:01.370 187518 DEBUG nova.network.neutron [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Successfully updated port: 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 19:58:01 np0005539279 nova_compute[187514]: 2025-11-29 00:58:01.393 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:58:01 np0005539279 nova_compute[187514]: 2025-11-29 00:58:01.393 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:58:01 np0005539279 nova_compute[187514]: 2025-11-29 00:58:01.394 187518 DEBUG nova.network.neutron [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 19:58:01 np0005539279 nova_compute[187514]: 2025-11-29 00:58:01.495 187518 DEBUG nova.compute.manager [req-89f616e0-76c0-4dcf-b36c-dbbe1de9e8ce req-de2c03ca-d6bb-44fd-9022-6b28e53f1175 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-changed-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:01 np0005539279 nova_compute[187514]: 2025-11-29 00:58:01.496 187518 DEBUG nova.compute.manager [req-89f616e0-76c0-4dcf-b36c-dbbe1de9e8ce req-de2c03ca-d6bb-44fd-9022-6b28e53f1175 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing instance network info cache due to event network-changed-619a3ed2-fa55-4d60-8e37-9fd4ff488e12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:58:01 np0005539279 nova_compute[187514]: 2025-11-29 00:58:01.496 187518 DEBUG oslo_concurrency.lockutils [req-89f616e0-76c0-4dcf-b36c-dbbe1de9e8ce req-de2c03ca-d6bb-44fd-9022-6b28e53f1175 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:58:01 np0005539279 nova_compute[187514]: 2025-11-29 00:58:01.590 187518 DEBUG nova.network.neutron [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.447 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.576 187518 DEBUG nova.network.neutron [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.601 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.602 187518 DEBUG nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Instance network_info: |[{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.602 187518 DEBUG oslo_concurrency.lockutils [req-89f616e0-76c0-4dcf-b36c-dbbe1de9e8ce req-de2c03ca-d6bb-44fd-9022-6b28e53f1175 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.603 187518 DEBUG nova.network.neutron [req-89f616e0-76c0-4dcf-b36c-dbbe1de9e8ce req-de2c03ca-d6bb-44fd-9022-6b28e53f1175 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing network info cache for port 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.608 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Start _get_guest_xml network_info=[{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.614 187518 WARNING nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.629 187518 DEBUG nova.virt.libvirt.host [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.630 187518 DEBUG nova.virt.libvirt.host [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.634 187518 DEBUG nova.virt.libvirt.host [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.635 187518 DEBUG nova.virt.libvirt.host [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.636 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.636 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.637 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.638 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.638 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.639 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.639 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.639 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.640 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.640 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.641 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.641 187518 DEBUG nova.virt.hardware [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.647 187518 DEBUG nova.virt.libvirt.vif [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2097217943',display_name='tempest-TestNetworkBasicOps-server-2097217943',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2097217943',id=6,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF910c7AbYs9dNjjBv3LjPc0J2B8QXYVJQTmIInU8lruARIboYDohKwEgeUYIOY0BzJik1EkH3h93U5lAz+8MC4WBOvGiE7MsVZAGGrudukAJjuE3vx7N1YN0Do/RHWTHw==',key_name='tempest-TestNetworkBasicOps-2147300220',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-glhr20uf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:57:58Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=cdaab479-3862-458b-b200-b443c1647c78,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.648 187518 DEBUG nova.network.os_vif_util [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.649 187518 DEBUG nova.network.os_vif_util [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:68:b1,bridge_name='br-int',has_traffic_filtering=True,id=619a3ed2-fa55-4d60-8e37-9fd4ff488e12,network=Network(772dc02e-f97e-4f35-bbad-0f0f22357164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap619a3ed2-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.651 187518 DEBUG nova.objects.instance [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid cdaab479-3862-458b-b200-b443c1647c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.671 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] End _get_guest_xml xml=<domain type="kvm">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <uuid>cdaab479-3862-458b-b200-b443c1647c78</uuid>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <name>instance-00000006</name>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-2097217943</nova:name>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 00:58:02</nova:creationTime>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:        <nova:port uuid="619a3ed2-fa55-4d60-8e37-9fd4ff488e12">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <system>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <entry name="serial">cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <entry name="uuid">cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    </system>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <os>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  </clock>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.config"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:0e:68:b1"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <target dev="tap619a3ed2-fa"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log" append="off"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    </serial>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <video>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 19:58:02 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 19:58:02 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:58:02 np0005539279 nova_compute[187514]: </domain>
Nov 28 19:58:02 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.673 187518 DEBUG nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Preparing to wait for external event network-vif-plugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.674 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.674 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.675 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.676 187518 DEBUG nova.virt.libvirt.vif [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2097217943',display_name='tempest-TestNetworkBasicOps-server-2097217943',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2097217943',id=6,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF910c7AbYs9dNjjBv3LjPc0J2B8QXYVJQTmIInU8lruARIboYDohKwEgeUYIOY0BzJik1EkH3h93U5lAz+8MC4WBOvGiE7MsVZAGGrudukAJjuE3vx7N1YN0Do/RHWTHw==',key_name='tempest-TestNetworkBasicOps-2147300220',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-glhr20uf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:57:58Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=cdaab479-3862-458b-b200-b443c1647c78,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.676 187518 DEBUG nova.network.os_vif_util [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.677 187518 DEBUG nova.network.os_vif_util [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:68:b1,bridge_name='br-int',has_traffic_filtering=True,id=619a3ed2-fa55-4d60-8e37-9fd4ff488e12,network=Network(772dc02e-f97e-4f35-bbad-0f0f22357164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap619a3ed2-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.678 187518 DEBUG os_vif [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:68:b1,bridge_name='br-int',has_traffic_filtering=True,id=619a3ed2-fa55-4d60-8e37-9fd4ff488e12,network=Network(772dc02e-f97e-4f35-bbad-0f0f22357164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap619a3ed2-fa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.679 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.679 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.680 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.684 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.685 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap619a3ed2-fa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.686 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap619a3ed2-fa, col_values=(('external_ids', {'iface-id': '619a3ed2-fa55-4d60-8e37-9fd4ff488e12', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:68:b1', 'vm-uuid': 'cdaab479-3862-458b-b200-b443c1647c78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:02 np0005539279 NetworkManager[55703]: <info>  [1764377882.7316] manager: (tap619a3ed2-fa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.730 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.736 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.741 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.742 187518 INFO os_vif [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:68:b1,bridge_name='br-int',has_traffic_filtering=True,id=619a3ed2-fa55-4d60-8e37-9fd4ff488e12,network=Network(772dc02e-f97e-4f35-bbad-0f0f22357164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap619a3ed2-fa')#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.807 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.808 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.809 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:0e:68:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:58:02 np0005539279 nova_compute[187514]: 2025-11-29 00:58:02.810 187518 INFO nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Using config drive#033[00m
Nov 28 19:58:03 np0005539279 nova_compute[187514]: 2025-11-29 00:58:03.729 187518 INFO nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Creating config drive at /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.config#033[00m
Nov 28 19:58:03 np0005539279 nova_compute[187514]: 2025-11-29 00:58:03.734 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpla4uvcd6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:58:03 np0005539279 nova_compute[187514]: 2025-11-29 00:58:03.875 187518 DEBUG oslo_concurrency.processutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpla4uvcd6" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:58:03 np0005539279 NetworkManager[55703]: <info>  [1764377883.9569] manager: (tap619a3ed2-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Nov 28 19:58:03 np0005539279 kernel: tap619a3ed2-fa: entered promiscuous mode
Nov 28 19:58:03 np0005539279 nova_compute[187514]: 2025-11-29 00:58:03.961 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:03 np0005539279 nova_compute[187514]: 2025-11-29 00:58:03.967 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:03 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:03Z|00091|binding|INFO|Claiming lport 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 for this chassis.
Nov 28 19:58:03 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:03Z|00092|binding|INFO|619a3ed2-fa55-4d60-8e37-9fd4ff488e12: Claiming fa:16:3e:0e:68:b1 10.100.0.4
Nov 28 19:58:03 np0005539279 nova_compute[187514]: 2025-11-29 00:58:03.974 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:03 np0005539279 nova_compute[187514]: 2025-11-29 00:58:03.978 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:03 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:03.986 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:68:b1 10.100.0.4'], port_security=['fa:16:3e:0e:68:b1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-772dc02e-f97e-4f35-bbad-0f0f22357164', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c188716c-76b9-447e-b8e8-521d447349a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef1e45ce-2254-4684-b7ba-97523ff379ec, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=619a3ed2-fa55-4d60-8e37-9fd4ff488e12) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:58:03 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:03.988 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 in datapath 772dc02e-f97e-4f35-bbad-0f0f22357164 bound to our chassis#033[00m
Nov 28 19:58:03 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:03.990 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 772dc02e-f97e-4f35-bbad-0f0f22357164#033[00m
Nov 28 19:58:04 np0005539279 systemd-machined[153752]: New machine qemu-6-instance-00000006.
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.015 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b9e832cf-9d78-48a8-9861-9574f0b54444]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.017 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap772dc02e-f1 in ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.019 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap772dc02e-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.019 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[262aca86-1ce9-4a28-8e24-a0ba456cff84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.020 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[640c6f74-aabb-4409-bce8-14b18ab522ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:04Z|00093|binding|INFO|Setting lport 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 ovn-installed in OVS
Nov 28 19:58:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:04Z|00094|binding|INFO|Setting lport 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 up in Southbound
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.030 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.033 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[98db2b3d-ac64-4568-8318-86edb8077f07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Nov 28 19:58:04 np0005539279 systemd-udevd[216591]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.061 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[3221b1bf-cd59-4e51-b7da-819729ef6c3a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 NetworkManager[55703]: <info>  [1764377884.0715] device (tap619a3ed2-fa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:58:04 np0005539279 NetworkManager[55703]: <info>  [1764377884.0728] device (tap619a3ed2-fa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.103 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[b03571c0-f19e-4dd4-9e23-2173c457f4ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 NetworkManager[55703]: <info>  [1764377884.1131] manager: (tap772dc02e-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.112 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[765e330c-7503-43e7-959f-36445e9076d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.158 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[a6bd8da9-5537-44fa-af10-082567ba536c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.163 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8c93a1-5a9d-4b4d-b3a9-d9172060d35e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 NetworkManager[55703]: <info>  [1764377884.2069] device (tap772dc02e-f0): carrier: link connected
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.217 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[7f0fe47f-90c5-4a03-8ae9-7371d661514a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.247 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[225d9de6-21a9-405c-987b-ced46ada1a0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap772dc02e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:d1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381791, 'reachable_time': 39067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216623, 'error': None, 'target': 'ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.276 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[d96f0ecd-1b14-4f7e-b553-d5914dccd903]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe80:d198'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 381791, 'tstamp': 381791}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216624, 'error': None, 'target': 'ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.322 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[4786f891-22df-4e6d-a36d-1f54f17df36d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap772dc02e-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:80:d1:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381791, 'reachable_time': 39067, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216628, 'error': None, 'target': 'ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.386 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[be8a9bf5-fc02-4f70-a4cb-12b6706b0cb8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.392 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377884.3907058, cdaab479-3862-458b-b200-b443c1647c78 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.392 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] VM Started (Lifecycle Event)#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.422 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.428 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377884.3916867, cdaab479-3862-458b-b200-b443c1647c78 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.428 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] VM Paused (Lifecycle Event)#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.455 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.460 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.478 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.489 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[42ab02f2-62a2-43b5-807f-0713ffa1f7b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.491 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap772dc02e-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.491 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.492 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap772dc02e-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.494 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:04 np0005539279 NetworkManager[55703]: <info>  [1764377884.4958] manager: (tap772dc02e-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 28 19:58:04 np0005539279 kernel: tap772dc02e-f0: entered promiscuous mode
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.499 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.501 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap772dc02e-f0, col_values=(('external_ids', {'iface-id': '1e718df3-210b-4d88-80e9-df977e4844c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:04Z|00095|binding|INFO|Releasing lport 1e718df3-210b-4d88-80e9-df977e4844c7 from this chassis (sb_readonly=0)
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.503 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.505 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/772dc02e-f97e-4f35-bbad-0f0f22357164.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/772dc02e-f97e-4f35-bbad-0f0f22357164.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.507 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[865de566-6ce1-41f3-ae12-09a7079d1671]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.508 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-772dc02e-f97e-4f35-bbad-0f0f22357164
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/772dc02e-f97e-4f35-bbad-0f0f22357164.pid.haproxy
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID 772dc02e-f97e-4f35-bbad-0f0f22357164
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 19:58:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:04.509 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164', 'env', 'PROCESS_TAG=haproxy-772dc02e-f97e-4f35-bbad-0f0f22357164', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/772dc02e-f97e-4f35-bbad-0f0f22357164.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.528 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.600 187518 DEBUG nova.compute.manager [req-9f0c3784-e6ec-4c92-ba52-93ce90bf3d5d req-b3e5f1f2-a8d7-4a3f-99ca-8a16404d467a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-plugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.601 187518 DEBUG oslo_concurrency.lockutils [req-9f0c3784-e6ec-4c92-ba52-93ce90bf3d5d req-b3e5f1f2-a8d7-4a3f-99ca-8a16404d467a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.602 187518 DEBUG oslo_concurrency.lockutils [req-9f0c3784-e6ec-4c92-ba52-93ce90bf3d5d req-b3e5f1f2-a8d7-4a3f-99ca-8a16404d467a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.602 187518 DEBUG oslo_concurrency.lockutils [req-9f0c3784-e6ec-4c92-ba52-93ce90bf3d5d req-b3e5f1f2-a8d7-4a3f-99ca-8a16404d467a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.603 187518 DEBUG nova.compute.manager [req-9f0c3784-e6ec-4c92-ba52-93ce90bf3d5d req-b3e5f1f2-a8d7-4a3f-99ca-8a16404d467a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Processing event network-vif-plugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.604 187518 DEBUG nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.609 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377884.6095078, cdaab479-3862-458b-b200-b443c1647c78 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.610 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] VM Resumed (Lifecycle Event)#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.613 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.618 187518 INFO nova.virt.libvirt.driver [-] [instance: cdaab479-3862-458b-b200-b443c1647c78] Instance spawned successfully.#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.619 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.631 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.640 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.647 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.648 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.649 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.649 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.650 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.651 187518 DEBUG nova.virt.libvirt.driver [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.665 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.719 187518 INFO nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Took 6.65 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.720 187518 DEBUG nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.791 187518 INFO nova.compute.manager [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Took 7.26 seconds to build instance.#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.810 187518 DEBUG oslo_concurrency.lockutils [None req-c32458d1-f1a5-49bf-ba21-91c801185bbf 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.914 187518 DEBUG nova.network.neutron [req-89f616e0-76c0-4dcf-b36c-dbbe1de9e8ce req-de2c03ca-d6bb-44fd-9022-6b28e53f1175 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updated VIF entry in instance network info cache for port 619a3ed2-fa55-4d60-8e37-9fd4ff488e12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.915 187518 DEBUG nova.network.neutron [req-89f616e0-76c0-4dcf-b36c-dbbe1de9e8ce req-de2c03ca-d6bb-44fd-9022-6b28e53f1175 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:58:04 np0005539279 nova_compute[187514]: 2025-11-29 00:58:04.936 187518 DEBUG oslo_concurrency.lockutils [req-89f616e0-76c0-4dcf-b36c-dbbe1de9e8ce req-de2c03ca-d6bb-44fd-9022-6b28e53f1175 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:58:04 np0005539279 podman[216663]: 2025-11-29 00:58:04.975010999 +0000 UTC m=+0.081271350 container create b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:58:05 np0005539279 systemd[1]: Started libpod-conmon-b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe.scope.
Nov 28 19:58:05 np0005539279 podman[216663]: 2025-11-29 00:58:04.9323727 +0000 UTC m=+0.038633111 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 19:58:05 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:58:05 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c2cae2d4a01427ecee6ddaeb52092a8f9fc436a86400ce83c23f4f4bbcbe7a61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 19:58:05 np0005539279 podman[216663]: 2025-11-29 00:58:05.09573607 +0000 UTC m=+0.201996471 container init b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 19:58:05 np0005539279 podman[216663]: 2025-11-29 00:58:05.101115757 +0000 UTC m=+0.207376108 container start b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 19:58:05 np0005539279 neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164[216678]: [NOTICE]   (216682) : New worker (216684) forked
Nov 28 19:58:05 np0005539279 neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164[216678]: [NOTICE]   (216682) : Loading success.
Nov 28 19:58:06 np0005539279 nova_compute[187514]: 2025-11-29 00:58:06.750 187518 DEBUG nova.compute.manager [req-c2008e5b-9db4-4ec4-b9cc-d4a5fd94de46 req-dac55c09-c2d6-4baa-99b0-b93ed6f79f86 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-plugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:06 np0005539279 nova_compute[187514]: 2025-11-29 00:58:06.751 187518 DEBUG oslo_concurrency.lockutils [req-c2008e5b-9db4-4ec4-b9cc-d4a5fd94de46 req-dac55c09-c2d6-4baa-99b0-b93ed6f79f86 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:06 np0005539279 nova_compute[187514]: 2025-11-29 00:58:06.752 187518 DEBUG oslo_concurrency.lockutils [req-c2008e5b-9db4-4ec4-b9cc-d4a5fd94de46 req-dac55c09-c2d6-4baa-99b0-b93ed6f79f86 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:06 np0005539279 nova_compute[187514]: 2025-11-29 00:58:06.752 187518 DEBUG oslo_concurrency.lockutils [req-c2008e5b-9db4-4ec4-b9cc-d4a5fd94de46 req-dac55c09-c2d6-4baa-99b0-b93ed6f79f86 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:06 np0005539279 nova_compute[187514]: 2025-11-29 00:58:06.753 187518 DEBUG nova.compute.manager [req-c2008e5b-9db4-4ec4-b9cc-d4a5fd94de46 req-dac55c09-c2d6-4baa-99b0-b93ed6f79f86 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] No waiting events found dispatching network-vif-plugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:58:06 np0005539279 nova_compute[187514]: 2025-11-29 00:58:06.753 187518 WARNING nova.compute.manager [req-c2008e5b-9db4-4ec4-b9cc-d4a5fd94de46 req-dac55c09-c2d6-4baa-99b0-b93ed6f79f86 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received unexpected event network-vif-plugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 for instance with vm_state active and task_state None.#033[00m
Nov 28 19:58:07 np0005539279 nova_compute[187514]: 2025-11-29 00:58:07.452 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:07 np0005539279 nova_compute[187514]: 2025-11-29 00:58:07.768 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:08.092 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:08.093 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:08.094 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:10 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:10Z|00096|binding|INFO|Releasing lport 1e718df3-210b-4d88-80e9-df977e4844c7 from this chassis (sb_readonly=0)
Nov 28 19:58:10 np0005539279 NetworkManager[55703]: <info>  [1764377890.7502] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Nov 28 19:58:10 np0005539279 NetworkManager[55703]: <info>  [1764377890.7514] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Nov 28 19:58:10 np0005539279 nova_compute[187514]: 2025-11-29 00:58:10.750 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:10 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:10Z|00097|binding|INFO|Releasing lport 1e718df3-210b-4d88-80e9-df977e4844c7 from this chassis (sb_readonly=0)
Nov 28 19:58:10 np0005539279 nova_compute[187514]: 2025-11-29 00:58:10.809 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:10 np0005539279 nova_compute[187514]: 2025-11-29 00:58:10.815 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:11 np0005539279 nova_compute[187514]: 2025-11-29 00:58:11.045 187518 DEBUG nova.compute.manager [req-4fc0bd5c-c190-4e65-96fb-c1ccccbdd2eb req-fc5a3dbc-30e8-4466-a535-9804c13bc9ed 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-changed-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:11 np0005539279 nova_compute[187514]: 2025-11-29 00:58:11.046 187518 DEBUG nova.compute.manager [req-4fc0bd5c-c190-4e65-96fb-c1ccccbdd2eb req-fc5a3dbc-30e8-4466-a535-9804c13bc9ed 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing instance network info cache due to event network-changed-619a3ed2-fa55-4d60-8e37-9fd4ff488e12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:58:11 np0005539279 nova_compute[187514]: 2025-11-29 00:58:11.046 187518 DEBUG oslo_concurrency.lockutils [req-4fc0bd5c-c190-4e65-96fb-c1ccccbdd2eb req-fc5a3dbc-30e8-4466-a535-9804c13bc9ed 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:58:11 np0005539279 nova_compute[187514]: 2025-11-29 00:58:11.046 187518 DEBUG oslo_concurrency.lockutils [req-4fc0bd5c-c190-4e65-96fb-c1ccccbdd2eb req-fc5a3dbc-30e8-4466-a535-9804c13bc9ed 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:58:11 np0005539279 nova_compute[187514]: 2025-11-29 00:58:11.047 187518 DEBUG nova.network.neutron [req-4fc0bd5c-c190-4e65-96fb-c1ccccbdd2eb req-fc5a3dbc-30e8-4466-a535-9804c13bc9ed 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing network info cache for port 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:58:12 np0005539279 nova_compute[187514]: 2025-11-29 00:58:12.291 187518 DEBUG nova.network.neutron [req-4fc0bd5c-c190-4e65-96fb-c1ccccbdd2eb req-fc5a3dbc-30e8-4466-a535-9804c13bc9ed 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updated VIF entry in instance network info cache for port 619a3ed2-fa55-4d60-8e37-9fd4ff488e12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:58:12 np0005539279 nova_compute[187514]: 2025-11-29 00:58:12.292 187518 DEBUG nova.network.neutron [req-4fc0bd5c-c190-4e65-96fb-c1ccccbdd2eb req-fc5a3dbc-30e8-4466-a535-9804c13bc9ed 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:58:12 np0005539279 nova_compute[187514]: 2025-11-29 00:58:12.316 187518 DEBUG oslo_concurrency.lockutils [req-4fc0bd5c-c190-4e65-96fb-c1ccccbdd2eb req-fc5a3dbc-30e8-4466-a535-9804c13bc9ed 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:58:12 np0005539279 nova_compute[187514]: 2025-11-29 00:58:12.453 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:12 np0005539279 nova_compute[187514]: 2025-11-29 00:58:12.809 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:12 np0005539279 podman[216698]: 2025-11-29 00:58:12.872140699 +0000 UTC m=+0.099018002 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:58:12 np0005539279 podman[216699]: 2025-11-29 00:58:12.906549314 +0000 UTC m=+0.132668667 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Nov 28 19:58:13 np0005539279 podman[216739]: 2025-11-29 00:58:13.011035314 +0000 UTC m=+0.099864405 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm)
Nov 28 19:58:15 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:15Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0e:68:b1 10.100.0.4
Nov 28 19:58:15 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:15Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:68:b1 10.100.0.4
Nov 28 19:58:16 np0005539279 nova_compute[187514]: 2025-11-29 00:58:16.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:16 np0005539279 nova_compute[187514]: 2025-11-29 00:58:16.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 19:58:16 np0005539279 nova_compute[187514]: 2025-11-29 00:58:16.649 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 19:58:17 np0005539279 nova_compute[187514]: 2025-11-29 00:58:17.455 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:17 np0005539279 nova_compute[187514]: 2025-11-29 00:58:17.812 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:20 np0005539279 nova_compute[187514]: 2025-11-29 00:58:20.650 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:21 np0005539279 nova_compute[187514]: 2025-11-29 00:58:21.452 187518 INFO nova.compute.manager [None req-04932e24-f663-450d-9d1c-b2cd7c11f1e7 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Get console output#033[00m
Nov 28 19:58:21 np0005539279 nova_compute[187514]: 2025-11-29 00:58:21.461 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 19:58:21 np0005539279 nova_compute[187514]: 2025-11-29 00:58:21.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:21 np0005539279 nova_compute[187514]: 2025-11-29 00:58:21.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.457 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.629 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.630 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.630 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.631 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.670 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.671 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.671 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.672 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.785 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.861 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.878 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.879 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:58:22 np0005539279 nova_compute[187514]: 2025-11-29 00:58:22.968 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.194 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.196 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5618MB free_disk=73.31011581420898GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.196 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.197 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.372 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Instance cdaab479-3862-458b-b200-b443c1647c78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.373 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.373 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.559 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.600 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.631 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:58:23 np0005539279 nova_compute[187514]: 2025-11-29 00:58:23.631 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:24 np0005539279 nova_compute[187514]: 2025-11-29 00:58:24.611 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:24 np0005539279 nova_compute[187514]: 2025-11-29 00:58:24.612 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:58:24 np0005539279 nova_compute[187514]: 2025-11-29 00:58:24.651 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 19:58:25 np0005539279 nova_compute[187514]: 2025-11-29 00:58:25.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:25 np0005539279 nova_compute[187514]: 2025-11-29 00:58:25.629 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:25 np0005539279 podman[216785]: 2025-11-29 00:58:25.854798827 +0000 UTC m=+0.083875000 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:58:25 np0005539279 podman[216784]: 2025-11-29 00:58:25.871836921 +0000 UTC m=+0.097436960 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, distribution-scope=public, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Nov 28 19:58:26 np0005539279 nova_compute[187514]: 2025-11-29 00:58:26.108 187518 DEBUG oslo_concurrency.lockutils [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "interface-cdaab479-3862-458b-b200-b443c1647c78-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:26 np0005539279 nova_compute[187514]: 2025-11-29 00:58:26.109 187518 DEBUG oslo_concurrency.lockutils [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "interface-cdaab479-3862-458b-b200-b443c1647c78-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:26 np0005539279 nova_compute[187514]: 2025-11-29 00:58:26.110 187518 DEBUG nova.objects.instance [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'flavor' on Instance uuid cdaab479-3862-458b-b200-b443c1647c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:58:26 np0005539279 nova_compute[187514]: 2025-11-29 00:58:26.540 187518 DEBUG nova.objects.instance [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_requests' on Instance uuid cdaab479-3862-458b-b200-b443c1647c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:58:26 np0005539279 nova_compute[187514]: 2025-11-29 00:58:26.557 187518 DEBUG nova.network.neutron [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 19:58:26 np0005539279 nova_compute[187514]: 2025-11-29 00:58:26.795 187518 DEBUG nova.policy [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 19:58:27 np0005539279 nova_compute[187514]: 2025-11-29 00:58:27.334 187518 DEBUG nova.network.neutron [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Successfully created port: 04107db0-1e00-49d9-8888-dd071f790f24 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 19:58:27 np0005539279 nova_compute[187514]: 2025-11-29 00:58:27.461 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:27 np0005539279 nova_compute[187514]: 2025-11-29 00:58:27.619 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:27 np0005539279 nova_compute[187514]: 2025-11-29 00:58:27.620 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:27 np0005539279 nova_compute[187514]: 2025-11-29 00:58:27.620 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:27 np0005539279 nova_compute[187514]: 2025-11-29 00:58:27.912 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:28 np0005539279 nova_compute[187514]: 2025-11-29 00:58:28.199 187518 DEBUG nova.network.neutron [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Successfully updated port: 04107db0-1e00-49d9-8888-dd071f790f24 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 19:58:28 np0005539279 nova_compute[187514]: 2025-11-29 00:58:28.215 187518 DEBUG oslo_concurrency.lockutils [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:58:28 np0005539279 nova_compute[187514]: 2025-11-29 00:58:28.216 187518 DEBUG oslo_concurrency.lockutils [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:58:28 np0005539279 nova_compute[187514]: 2025-11-29 00:58:28.216 187518 DEBUG nova.network.neutron [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 19:58:28 np0005539279 nova_compute[187514]: 2025-11-29 00:58:28.332 187518 DEBUG nova.compute.manager [req-7ae87049-f86d-43e1-8c9d-2d10c4e8e507 req-8a3002ac-b988-48b2-b4a8-878d0b364ebc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-changed-04107db0-1e00-49d9-8888-dd071f790f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:28 np0005539279 nova_compute[187514]: 2025-11-29 00:58:28.333 187518 DEBUG nova.compute.manager [req-7ae87049-f86d-43e1-8c9d-2d10c4e8e507 req-8a3002ac-b988-48b2-b4a8-878d0b364ebc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing instance network info cache due to event network-changed-04107db0-1e00-49d9-8888-dd071f790f24. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:58:28 np0005539279 nova_compute[187514]: 2025-11-29 00:58:28.334 187518 DEBUG oslo_concurrency.lockutils [req-7ae87049-f86d-43e1-8c9d-2d10c4e8e507 req-8a3002ac-b988-48b2-b4a8-878d0b364ebc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:58:28 np0005539279 nova_compute[187514]: 2025-11-29 00:58:28.605 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:58:28 np0005539279 podman[216830]: 2025-11-29 00:58:28.849247729 +0000 UTC m=+0.088467146 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 19:58:28 np0005539279 podman[216829]: 2025-11-29 00:58:28.894612242 +0000 UTC m=+0.132320528 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.463 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.631 187518 DEBUG nova.network.neutron [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.652 187518 DEBUG oslo_concurrency.lockutils [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.653 187518 DEBUG oslo_concurrency.lockutils [req-7ae87049-f86d-43e1-8c9d-2d10c4e8e507 req-8a3002ac-b988-48b2-b4a8-878d0b364ebc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.653 187518 DEBUG nova.network.neutron [req-7ae87049-f86d-43e1-8c9d-2d10c4e8e507 req-8a3002ac-b988-48b2-b4a8-878d0b364ebc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing network info cache for port 04107db0-1e00-49d9-8888-dd071f790f24 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.658 187518 DEBUG nova.virt.libvirt.vif [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2097217943',display_name='tempest-TestNetworkBasicOps-server-2097217943',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2097217943',id=6,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF910c7AbYs9dNjjBv3LjPc0J2B8QXYVJQTmIInU8lruARIboYDohKwEgeUYIOY0BzJik1EkH3h93U5lAz+8MC4WBOvGiE7MsVZAGGrudukAJjuE3vx7N1YN0Do/RHWTHw==',key_name='tempest-TestNetworkBasicOps-2147300220',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:58:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-glhr20uf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:58:04Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=cdaab479-3862-458b-b200-b443c1647c78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.658 187518 DEBUG nova.network.os_vif_util [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.659 187518 DEBUG nova.network.os_vif_util [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.659 187518 DEBUG os_vif [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.660 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.660 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.661 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.665 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.666 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04107db0-1e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.666 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap04107db0-1e, col_values=(('external_ids', {'iface-id': '04107db0-1e00-49d9-8888-dd071f790f24', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:7c:e0', 'vm-uuid': 'cdaab479-3862-458b-b200-b443c1647c78'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:32 np0005539279 NetworkManager[55703]: <info>  [1764377912.6693] manager: (tap04107db0-1e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.673 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.674 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.730 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.732 187518 INFO os_vif [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e')#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.733 187518 DEBUG nova.virt.libvirt.vif [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2097217943',display_name='tempest-TestNetworkBasicOps-server-2097217943',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2097217943',id=6,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF910c7AbYs9dNjjBv3LjPc0J2B8QXYVJQTmIInU8lruARIboYDohKwEgeUYIOY0BzJik1EkH3h93U5lAz+8MC4WBOvGiE7MsVZAGGrudukAJjuE3vx7N1YN0Do/RHWTHw==',key_name='tempest-TestNetworkBasicOps-2147300220',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:58:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-glhr20uf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:58:04Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=cdaab479-3862-458b-b200-b443c1647c78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.734 187518 DEBUG nova.network.os_vif_util [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.734 187518 DEBUG nova.network.os_vif_util [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.740 187518 DEBUG nova.virt.libvirt.guest [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] attach device xml: <interface type="ethernet">
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <mac address="fa:16:3e:9f:7c:e0"/>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <model type="virtio"/>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <driver name="vhost" rx_queue_size="512"/>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <mtu size="1442"/>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <target dev="tap04107db0-1e"/>
Nov 28 19:58:32 np0005539279 nova_compute[187514]: </interface>
Nov 28 19:58:32 np0005539279 nova_compute[187514]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 28 19:58:32 np0005539279 kernel: tap04107db0-1e: entered promiscuous mode
Nov 28 19:58:32 np0005539279 NetworkManager[55703]: <info>  [1764377912.7580] manager: (tap04107db0-1e): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Nov 28 19:58:32 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:32Z|00098|binding|INFO|Claiming lport 04107db0-1e00-49d9-8888-dd071f790f24 for this chassis.
Nov 28 19:58:32 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:32Z|00099|binding|INFO|04107db0-1e00-49d9-8888-dd071f790f24: Claiming fa:16:3e:9f:7c:e0 10.100.0.19
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.759 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.764 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.773 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:7c:e0 10.100.0.19'], port_security=['fa:16:3e:9f:7c:e0 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beb28e65-81a9-4c61-962b-bcd4d536483d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9bd29a3-ae46-41d8-aaea-3325e1bc2031', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d853b3f-840e-4869-bcf7-a67e0ea8364c, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=04107db0-1e00-49d9-8888-dd071f790f24) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.775 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 04107db0-1e00-49d9-8888-dd071f790f24 in datapath beb28e65-81a9-4c61-962b-bcd4d536483d bound to our chassis#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.777 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network beb28e65-81a9-4c61-962b-bcd4d536483d#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.794 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[8b2704db-c58d-47f2-9bef-18e8ac182bba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.795 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbeb28e65-81 in ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.797 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbeb28e65-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.798 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[de402c67-b272-42cd-8d93-7791526addf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.797 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.799 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c24306e1-3ddf-439f-a078-8b4a75fbefb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:32Z|00100|binding|INFO|Setting lport 04107db0-1e00-49d9-8888-dd071f790f24 ovn-installed in OVS
Nov 28 19:58:32 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:32Z|00101|binding|INFO|Setting lport 04107db0-1e00-49d9-8888-dd071f790f24 up in Southbound
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.802 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:32 np0005539279 systemd-udevd[216883]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.817 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae71c11-f28f-4452-bb52-9705738e28a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.832 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[d0981830-a1e7-412d-b5b5-4c2cfc3c059c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 NetworkManager[55703]: <info>  [1764377912.8423] device (tap04107db0-1e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:58:32 np0005539279 NetworkManager[55703]: <info>  [1764377912.8433] device (tap04107db0-1e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.869 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[de99687e-b95b-4b17-80fa-ed6be0efe9d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.875 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[7323d735-67af-43cc-9b78-df07b8aa8381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 NetworkManager[55703]: <info>  [1764377912.8769] manager: (tapbeb28e65-80): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Nov 28 19:58:32 np0005539279 systemd-udevd[216887]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.883 187518 DEBUG nova.virt.libvirt.driver [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.884 187518 DEBUG nova.virt.libvirt.driver [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.884 187518 DEBUG nova.virt.libvirt.driver [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:0e:68:b1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.884 187518 DEBUG nova.virt.libvirt.driver [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:9f:7c:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.912 187518 DEBUG nova.virt.libvirt.guest [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <nova:name>tempest-TestNetworkBasicOps-server-2097217943</nova:name>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <nova:creationTime>2025-11-29 00:58:32</nova:creationTime>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <nova:flavor name="m1.nano">
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    <nova:memory>128</nova:memory>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    <nova:disk>1</nova:disk>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    <nova:swap>0</nova:swap>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    <nova:vcpus>1</nova:vcpus>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  </nova:flavor>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <nova:owner>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  </nova:owner>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  <nova:ports>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    <nova:port uuid="619a3ed2-fa55-4d60-8e37-9fd4ff488e12">
Nov 28 19:58:32 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    <nova:port uuid="04107db0-1e00-49d9-8888-dd071f790f24">
Nov 28 19:58:32 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 19:58:32 np0005539279 nova_compute[187514]:  </nova:ports>
Nov 28 19:58:32 np0005539279 nova_compute[187514]: </nova:instance>
Nov 28 19:58:32 np0005539279 nova_compute[187514]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.921 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb7d97b-fb78-4774-836e-655b8231965f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.926 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[f9164083-9430-43f1-bda9-e83acb439ca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 nova_compute[187514]: 2025-11-29 00:58:32.940 187518 DEBUG oslo_concurrency.lockutils [None req-9e6674b2-d254-40aa-8ed8-1af3da8e9cdb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "interface-cdaab479-3862-458b-b200-b443c1647c78-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:32 np0005539279 NetworkManager[55703]: <info>  [1764377912.9585] device (tapbeb28e65-80): carrier: link connected
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.965 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[33d942c4-a11b-4166-b489-76978de2aecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:32.995 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[567424ea-cae8-4693-914c-06c9eae27078]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbeb28e65-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:4c:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384666, 'reachable_time': 21671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216908, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.018 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[be96f313-7911-4312-8442-4770a132bed8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:4c16'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384666, 'tstamp': 384666}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216909, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.045 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[39b69378-7ef9-4385-be68-06eb2797582b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbeb28e65-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:4c:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384666, 'reachable_time': 21671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216910, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.089 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[f5b5c157-e85c-423e-8328-9118f4d333e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.182 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[a127884d-8c4a-4b23-b74b-33d33d8d8e85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.185 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbeb28e65-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.186 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.186 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbeb28e65-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:33 np0005539279 kernel: tapbeb28e65-80: entered promiscuous mode
Nov 28 19:58:33 np0005539279 NetworkManager[55703]: <info>  [1764377913.1894] manager: (tapbeb28e65-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.188 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.194 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbeb28e65-80, col_values=(('external_ids', {'iface-id': '8657db2e-0ad1-471b-be5c-ea510f417caf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.195 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:33 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:33Z|00102|binding|INFO|Releasing lport 8657db2e-0ad1-471b-be5c-ea510f417caf from this chassis (sb_readonly=0)
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.197 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.197 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/beb28e65-81a9-4c61-962b-bcd4d536483d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/beb28e65-81a9-4c61-962b-bcd4d536483d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.198 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[777188b7-7663-45f4-8fd1-ee4773971202]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.200 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-beb28e65-81a9-4c61-962b-bcd4d536483d
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/beb28e65-81a9-4c61-962b-bcd4d536483d.pid.haproxy
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID beb28e65-81a9-4c61-962b-bcd4d536483d
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 19:58:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:33.201 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'env', 'PROCESS_TAG=haproxy-beb28e65-81a9-4c61-962b-bcd4d536483d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/beb28e65-81a9-4c61-962b-bcd4d536483d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.211 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.550 187518 DEBUG nova.compute.manager [req-7f43d80d-b372-4cb8-869c-a82e1271281f req-45d4d51d-4c87-4f24-80bb-b8d5f3356c07 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-plugged-04107db0-1e00-49d9-8888-dd071f790f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.552 187518 DEBUG oslo_concurrency.lockutils [req-7f43d80d-b372-4cb8-869c-a82e1271281f req-45d4d51d-4c87-4f24-80bb-b8d5f3356c07 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.552 187518 DEBUG oslo_concurrency.lockutils [req-7f43d80d-b372-4cb8-869c-a82e1271281f req-45d4d51d-4c87-4f24-80bb-b8d5f3356c07 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.552 187518 DEBUG oslo_concurrency.lockutils [req-7f43d80d-b372-4cb8-869c-a82e1271281f req-45d4d51d-4c87-4f24-80bb-b8d5f3356c07 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.552 187518 DEBUG nova.compute.manager [req-7f43d80d-b372-4cb8-869c-a82e1271281f req-45d4d51d-4c87-4f24-80bb-b8d5f3356c07 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] No waiting events found dispatching network-vif-plugged-04107db0-1e00-49d9-8888-dd071f790f24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:58:33 np0005539279 nova_compute[187514]: 2025-11-29 00:58:33.552 187518 WARNING nova.compute.manager [req-7f43d80d-b372-4cb8-869c-a82e1271281f req-45d4d51d-4c87-4f24-80bb-b8d5f3356c07 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received unexpected event network-vif-plugged-04107db0-1e00-49d9-8888-dd071f790f24 for instance with vm_state active and task_state None.#033[00m
Nov 28 19:58:33 np0005539279 podman[216942]: 2025-11-29 00:58:33.624610035 +0000 UTC m=+0.060305500 container create e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 19:58:33 np0005539279 systemd[1]: Started libpod-conmon-e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8.scope.
Nov 28 19:58:33 np0005539279 podman[216942]: 2025-11-29 00:58:33.596996465 +0000 UTC m=+0.032691910 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 19:58:33 np0005539279 systemd[1]: Started libcrun container.
Nov 28 19:58:33 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cfbea9a12fc84277159a868b18d1d707bd27a2b5bd58affab360cdaf72e4f92e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 19:58:33 np0005539279 podman[216942]: 2025-11-29 00:58:33.752583804 +0000 UTC m=+0.188279279 container init e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 19:58:33 np0005539279 podman[216942]: 2025-11-29 00:58:33.76238373 +0000 UTC m=+0.198079185 container start e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 19:58:33 np0005539279 neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d[216958]: [NOTICE]   (216962) : New worker (216964) forked
Nov 28 19:58:33 np0005539279 neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d[216958]: [NOTICE]   (216962) : Loading success.
Nov 28 19:58:34 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:34Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9f:7c:e0 10.100.0.19
Nov 28 19:58:34 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:34Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:7c:e0 10.100.0.19
Nov 28 19:58:35 np0005539279 nova_compute[187514]: 2025-11-29 00:58:35.423 187518 DEBUG nova.network.neutron [req-7ae87049-f86d-43e1-8c9d-2d10c4e8e507 req-8a3002ac-b988-48b2-b4a8-878d0b364ebc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updated VIF entry in instance network info cache for port 04107db0-1e00-49d9-8888-dd071f790f24. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:58:35 np0005539279 nova_compute[187514]: 2025-11-29 00:58:35.424 187518 DEBUG nova.network.neutron [req-7ae87049-f86d-43e1-8c9d-2d10c4e8e507 req-8a3002ac-b988-48b2-b4a8-878d0b364ebc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:58:35 np0005539279 nova_compute[187514]: 2025-11-29 00:58:35.455 187518 DEBUG oslo_concurrency.lockutils [req-7ae87049-f86d-43e1-8c9d-2d10c4e8e507 req-8a3002ac-b988-48b2-b4a8-878d0b364ebc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:58:35 np0005539279 nova_compute[187514]: 2025-11-29 00:58:35.664 187518 DEBUG nova.compute.manager [req-2bdae809-81de-4c47-86d6-cca591df0627 req-8b52cd16-27f5-4713-9e40-024c3ee1d46e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-plugged-04107db0-1e00-49d9-8888-dd071f790f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:35 np0005539279 nova_compute[187514]: 2025-11-29 00:58:35.664 187518 DEBUG oslo_concurrency.lockutils [req-2bdae809-81de-4c47-86d6-cca591df0627 req-8b52cd16-27f5-4713-9e40-024c3ee1d46e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:35 np0005539279 nova_compute[187514]: 2025-11-29 00:58:35.665 187518 DEBUG oslo_concurrency.lockutils [req-2bdae809-81de-4c47-86d6-cca591df0627 req-8b52cd16-27f5-4713-9e40-024c3ee1d46e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:35 np0005539279 nova_compute[187514]: 2025-11-29 00:58:35.665 187518 DEBUG oslo_concurrency.lockutils [req-2bdae809-81de-4c47-86d6-cca591df0627 req-8b52cd16-27f5-4713-9e40-024c3ee1d46e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:35 np0005539279 nova_compute[187514]: 2025-11-29 00:58:35.666 187518 DEBUG nova.compute.manager [req-2bdae809-81de-4c47-86d6-cca591df0627 req-8b52cd16-27f5-4713-9e40-024c3ee1d46e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] No waiting events found dispatching network-vif-plugged-04107db0-1e00-49d9-8888-dd071f790f24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:58:35 np0005539279 nova_compute[187514]: 2025-11-29 00:58:35.666 187518 WARNING nova.compute.manager [req-2bdae809-81de-4c47-86d6-cca591df0627 req-8b52cd16-27f5-4713-9e40-024c3ee1d46e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received unexpected event network-vif-plugged-04107db0-1e00-49d9-8888-dd071f790f24 for instance with vm_state active and task_state None.#033[00m
Nov 28 19:58:37 np0005539279 nova_compute[187514]: 2025-11-29 00:58:37.466 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:37 np0005539279 nova_compute[187514]: 2025-11-29 00:58:37.669 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:41 np0005539279 nova_compute[187514]: 2025-11-29 00:58:41.607 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "375250f0-4625-4017-ac44-e74799c55dbf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:41 np0005539279 nova_compute[187514]: 2025-11-29 00:58:41.608 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:41 np0005539279 nova_compute[187514]: 2025-11-29 00:58:41.629 187518 DEBUG nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 19:58:41 np0005539279 nova_compute[187514]: 2025-11-29 00:58:41.756 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:41 np0005539279 nova_compute[187514]: 2025-11-29 00:58:41.757 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:41 np0005539279 nova_compute[187514]: 2025-11-29 00:58:41.769 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 19:58:41 np0005539279 nova_compute[187514]: 2025-11-29 00:58:41.770 187518 INFO nova.compute.claims [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 19:58:41 np0005539279 nova_compute[187514]: 2025-11-29 00:58:41.989 187518 DEBUG nova.compute.provider_tree [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.011 187518 DEBUG nova.scheduler.client.report [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.061 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.063 187518 DEBUG nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.138 187518 DEBUG nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.139 187518 DEBUG nova.network.neutron [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.166 187518 INFO nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.203 187518 DEBUG nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.311 187518 DEBUG nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.313 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.314 187518 INFO nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Creating image(s)#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.315 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.315 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.317 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.344 187518 DEBUG nova.policy [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.349 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.441 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.444 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.445 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.470 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.498 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.554 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.555 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.594 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.596 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.596 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.651 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.652 187518 DEBUG nova.virt.disk.api [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.652 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.672 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.706 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.706 187518 DEBUG nova.virt.disk.api [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.707 187518 DEBUG nova.objects.instance [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid 375250f0-4625-4017-ac44-e74799c55dbf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.724 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.724 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Ensure instance console log exists: /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.725 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.725 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:42 np0005539279 nova_compute[187514]: 2025-11-29 00:58:42.725 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:43 np0005539279 podman[216992]: 2025-11-29 00:58:43.864285718 +0000 UTC m=+0.088333582 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 19:58:43 np0005539279 podman[216993]: 2025-11-29 00:58:43.868090601 +0000 UTC m=+0.089089073 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 28 19:58:43 np0005539279 podman[216991]: 2025-11-29 00:58:43.873704764 +0000 UTC m=+0.101361947 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:58:45 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:45.406 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:58:45 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:45.407 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 19:58:45 np0005539279 nova_compute[187514]: 2025-11-29 00:58:45.441 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:46 np0005539279 nova_compute[187514]: 2025-11-29 00:58:46.355 187518 DEBUG nova.network.neutron [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Successfully created port: 4ce4680e-f578-4ef3-8110-b81c6011ca78 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.056 187518 DEBUG nova.network.neutron [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Successfully updated port: 4ce4680e-f578-4ef3-8110-b81c6011ca78 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.071 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-375250f0-4625-4017-ac44-e74799c55dbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.072 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-375250f0-4625-4017-ac44-e74799c55dbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.072 187518 DEBUG nova.network.neutron [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.159 187518 DEBUG nova.compute.manager [req-0eaa879f-ecd3-4222-a0b7-ffd036088cb3 req-0e56342a-fc7f-4d82-b361-02130b95c660 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Received event network-changed-4ce4680e-f578-4ef3-8110-b81c6011ca78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.160 187518 DEBUG nova.compute.manager [req-0eaa879f-ecd3-4222-a0b7-ffd036088cb3 req-0e56342a-fc7f-4d82-b361-02130b95c660 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Refreshing instance network info cache due to event network-changed-4ce4680e-f578-4ef3-8110-b81c6011ca78. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.161 187518 DEBUG oslo_concurrency.lockutils [req-0eaa879f-ecd3-4222-a0b7-ffd036088cb3 req-0e56342a-fc7f-4d82-b361-02130b95c660 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-375250f0-4625-4017-ac44-e74799c55dbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.210 187518 DEBUG nova.network.neutron [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.471 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:47 np0005539279 nova_compute[187514]: 2025-11-29 00:58:47.674 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.371 187518 DEBUG nova.network.neutron [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Updating instance_info_cache with network_info: [{"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.413 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-375250f0-4625-4017-ac44-e74799c55dbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.414 187518 DEBUG nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Instance network_info: |[{"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.415 187518 DEBUG oslo_concurrency.lockutils [req-0eaa879f-ecd3-4222-a0b7-ffd036088cb3 req-0e56342a-fc7f-4d82-b361-02130b95c660 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-375250f0-4625-4017-ac44-e74799c55dbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.416 187518 DEBUG nova.network.neutron [req-0eaa879f-ecd3-4222-a0b7-ffd036088cb3 req-0e56342a-fc7f-4d82-b361-02130b95c660 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Refreshing network info cache for port 4ce4680e-f578-4ef3-8110-b81c6011ca78 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.423 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Start _get_guest_xml network_info=[{"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.435 187518 WARNING nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.446 187518 DEBUG nova.virt.libvirt.host [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.447 187518 DEBUG nova.virt.libvirt.host [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.482 187518 DEBUG nova.virt.libvirt.host [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.483 187518 DEBUG nova.virt.libvirt.host [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.484 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.485 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.485 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.486 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.486 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.487 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.487 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.488 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.488 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.489 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.489 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.490 187518 DEBUG nova.virt.hardware [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.497 187518 DEBUG nova.virt.libvirt.vif [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-789600378',display_name='tempest-TestNetworkBasicOps-server-789600378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-789600378',id=7,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKqxNWaE/K8iXOUK4SxlB5irt2bYuR4n+JtaAN+FnQAgYJ2yndSRSNkEbwzuHQdRKn+8qNjOtQlC+1yxvPDcocUJM6LWl4jABxMpf5CHJW7UphTphp1DuQz5nJcotlpFOg==',key_name='tempest-TestNetworkBasicOps-2086077250',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-dm6z0msm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:58:42Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=375250f0-4625-4017-ac44-e74799c55dbf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.498 187518 DEBUG nova.network.os_vif_util [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.500 187518 DEBUG nova.network.os_vif_util [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:f7:d4,bridge_name='br-int',has_traffic_filtering=True,id=4ce4680e-f578-4ef3-8110-b81c6011ca78,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ce4680e-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.504 187518 DEBUG nova.objects.instance [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid 375250f0-4625-4017-ac44-e74799c55dbf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.527 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] End _get_guest_xml xml=<domain type="kvm">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <uuid>375250f0-4625-4017-ac44-e74799c55dbf</uuid>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <name>instance-00000007</name>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-789600378</nova:name>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 00:58:49</nova:creationTime>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:        <nova:port uuid="4ce4680e-f578-4ef3-8110-b81c6011ca78">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <system>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <entry name="serial">375250f0-4625-4017-ac44-e74799c55dbf</entry>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <entry name="uuid">375250f0-4625-4017-ac44-e74799c55dbf</entry>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    </system>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <os>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  </os>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <features>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  </features>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  </clock>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  <devices>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk.config"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    </disk>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:13:f7:d4"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <target dev="tap4ce4680e-f5"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    </interface>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/console.log" append="off"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    </serial>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <video>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    </video>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    </rng>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 19:58:49 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 19:58:49 np0005539279 nova_compute[187514]:  </devices>
Nov 28 19:58:49 np0005539279 nova_compute[187514]: </domain>
Nov 28 19:58:49 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.530 187518 DEBUG nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Preparing to wait for external event network-vif-plugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.530 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "375250f0-4625-4017-ac44-e74799c55dbf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.531 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.531 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.532 187518 DEBUG nova.virt.libvirt.vif [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T00:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-789600378',display_name='tempest-TestNetworkBasicOps-server-789600378',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-789600378',id=7,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKqxNWaE/K8iXOUK4SxlB5irt2bYuR4n+JtaAN+FnQAgYJ2yndSRSNkEbwzuHQdRKn+8qNjOtQlC+1yxvPDcocUJM6LWl4jABxMpf5CHJW7UphTphp1DuQz5nJcotlpFOg==',key_name='tempest-TestNetworkBasicOps-2086077250',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-dm6z0msm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T00:58:42Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=375250f0-4625-4017-ac44-e74799c55dbf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.532 187518 DEBUG nova.network.os_vif_util [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.533 187518 DEBUG nova.network.os_vif_util [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:f7:d4,bridge_name='br-int',has_traffic_filtering=True,id=4ce4680e-f578-4ef3-8110-b81c6011ca78,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ce4680e-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.534 187518 DEBUG os_vif [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:f7:d4,bridge_name='br-int',has_traffic_filtering=True,id=4ce4680e-f578-4ef3-8110-b81c6011ca78,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ce4680e-f5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.535 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.535 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.536 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.539 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.540 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ce4680e-f5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.541 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ce4680e-f5, col_values=(('external_ids', {'iface-id': '4ce4680e-f578-4ef3-8110-b81c6011ca78', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:13:f7:d4', 'vm-uuid': '375250f0-4625-4017-ac44-e74799c55dbf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.543 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:49 np0005539279 NetworkManager[55703]: <info>  [1764377929.5463] manager: (tap4ce4680e-f5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.546 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.554 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.557 187518 INFO os_vif [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:f7:d4,bridge_name='br-int',has_traffic_filtering=True,id=4ce4680e-f578-4ef3-8110-b81c6011ca78,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ce4680e-f5')#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.623 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.624 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.624 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:13:f7:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 19:58:49 np0005539279 nova_compute[187514]: 2025-11-29 00:58:49.625 187518 INFO nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Using config drive#033[00m
Nov 28 19:58:50 np0005539279 nova_compute[187514]: 2025-11-29 00:58:50.569 187518 INFO nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Creating config drive at /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk.config#033[00m
Nov 28 19:58:50 np0005539279 nova_compute[187514]: 2025-11-29 00:58:50.578 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeur55qk6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:58:50 np0005539279 nova_compute[187514]: 2025-11-29 00:58:50.721 187518 DEBUG oslo_concurrency.processutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpeur55qk6" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:58:50 np0005539279 kernel: tap4ce4680e-f5: entered promiscuous mode
Nov 28 19:58:50 np0005539279 NetworkManager[55703]: <info>  [1764377930.8226] manager: (tap4ce4680e-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Nov 28 19:58:50 np0005539279 nova_compute[187514]: 2025-11-29 00:58:50.825 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:50 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:50Z|00103|binding|INFO|Claiming lport 4ce4680e-f578-4ef3-8110-b81c6011ca78 for this chassis.
Nov 28 19:58:50 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:50Z|00104|binding|INFO|4ce4680e-f578-4ef3-8110-b81c6011ca78: Claiming fa:16:3e:13:f7:d4 10.100.0.27
Nov 28 19:58:50 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:50.854 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:f7:d4 10.100.0.27'], port_security=['fa:16:3e:13:f7:d4 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beb28e65-81a9-4c61-962b-bcd4d536483d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ef49e330-de8e-4b91-bb66-5d9b2f8c3106', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d853b3f-840e-4869-bcf7-a67e0ea8364c, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=4ce4680e-f578-4ef3-8110-b81c6011ca78) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 19:58:50 np0005539279 nova_compute[187514]: 2025-11-29 00:58:50.854 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:50 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:50Z|00105|binding|INFO|Setting lport 4ce4680e-f578-4ef3-8110-b81c6011ca78 ovn-installed in OVS
Nov 28 19:58:50 np0005539279 ovn_controller[95686]: 2025-11-29T00:58:50Z|00106|binding|INFO|Setting lport 4ce4680e-f578-4ef3-8110-b81c6011ca78 up in Southbound
Nov 28 19:58:50 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:50.858 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 4ce4680e-f578-4ef3-8110-b81c6011ca78 in datapath beb28e65-81a9-4c61-962b-bcd4d536483d bound to our chassis#033[00m
Nov 28 19:58:50 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:50.861 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network beb28e65-81a9-4c61-962b-bcd4d536483d#033[00m
Nov 28 19:58:50 np0005539279 nova_compute[187514]: 2025-11-29 00:58:50.862 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:50 np0005539279 systemd-udevd[217070]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 19:58:50 np0005539279 systemd-machined[153752]: New machine qemu-7-instance-00000007.
Nov 28 19:58:50 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:50.895 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[32d36e26-2753-41a4-84bc-7a7fb9c3e1e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:50 np0005539279 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Nov 28 19:58:50 np0005539279 NetworkManager[55703]: <info>  [1764377930.9072] device (tap4ce4680e-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 19:58:50 np0005539279 NetworkManager[55703]: <info>  [1764377930.9093] device (tap4ce4680e-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 19:58:50 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:50.943 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[62cec30f-b9f1-483f-861e-a8536ba2b728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:50 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:50.948 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[257f1b22-75a7-40fd-afa6-aa66db8099fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:50 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:50.994 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[4f5cb5d7-c24e-4600-a31d-0517fae64a02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:51.025 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[896114ec-dbd3-4c57-ba0e-d290380d20ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbeb28e65-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:4c:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384666, 'reachable_time': 21671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217084, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:51.055 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c2bcb6cd-3ba7-462c-8170-f6c3be5a5c91]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbeb28e65-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384683, 'tstamp': 384683}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217085, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapbeb28e65-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384687, 'tstamp': 384687}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217085, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 19:58:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:51.058 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbeb28e65-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.061 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.062 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:51.063 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbeb28e65-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:51.064 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:58:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:51.064 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbeb28e65-80, col_values=(('external_ids', {'iface-id': '8657db2e-0ad1-471b-be5c-ea510f417caf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:51 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:51.065 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.273 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377931.2726178, 375250f0-4625-4017-ac44-e74799c55dbf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.273 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] VM Started (Lifecycle Event)#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.302 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.309 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377931.2756004, 375250f0-4625-4017-ac44-e74799c55dbf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.310 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] VM Paused (Lifecycle Event)#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.358 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.364 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.409 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.661 187518 DEBUG nova.compute.manager [req-4b360ef8-727a-40df-ac35-5963c45065d0 req-15c6fa83-2360-4db6-967c-633f7172e46b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Received event network-vif-plugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.662 187518 DEBUG oslo_concurrency.lockutils [req-4b360ef8-727a-40df-ac35-5963c45065d0 req-15c6fa83-2360-4db6-967c-633f7172e46b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "375250f0-4625-4017-ac44-e74799c55dbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.663 187518 DEBUG oslo_concurrency.lockutils [req-4b360ef8-727a-40df-ac35-5963c45065d0 req-15c6fa83-2360-4db6-967c-633f7172e46b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.663 187518 DEBUG oslo_concurrency.lockutils [req-4b360ef8-727a-40df-ac35-5963c45065d0 req-15c6fa83-2360-4db6-967c-633f7172e46b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.664 187518 DEBUG nova.compute.manager [req-4b360ef8-727a-40df-ac35-5963c45065d0 req-15c6fa83-2360-4db6-967c-633f7172e46b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Processing event network-vif-plugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.666 187518 DEBUG nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.671 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764377931.671049, 375250f0-4625-4017-ac44-e74799c55dbf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.672 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] VM Resumed (Lifecycle Event)#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.675 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.681 187518 INFO nova.virt.libvirt.driver [-] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Instance spawned successfully.#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.682 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.714 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.720 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.720 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.721 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.722 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.722 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.723 187518 DEBUG nova.virt.libvirt.driver [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.732 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.769 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.796 187518 INFO nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Took 9.48 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.797 187518 DEBUG nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.810 187518 DEBUG nova.network.neutron [req-0eaa879f-ecd3-4222-a0b7-ffd036088cb3 req-0e56342a-fc7f-4d82-b361-02130b95c660 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Updated VIF entry in instance network info cache for port 4ce4680e-f578-4ef3-8110-b81c6011ca78. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.811 187518 DEBUG nova.network.neutron [req-0eaa879f-ecd3-4222-a0b7-ffd036088cb3 req-0e56342a-fc7f-4d82-b361-02130b95c660 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Updating instance_info_cache with network_info: [{"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.846 187518 DEBUG oslo_concurrency.lockutils [req-0eaa879f-ecd3-4222-a0b7-ffd036088cb3 req-0e56342a-fc7f-4d82-b361-02130b95c660 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-375250f0-4625-4017-ac44-e74799c55dbf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.884 187518 INFO nova.compute.manager [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Took 10.17 seconds to build instance.#033[00m
Nov 28 19:58:51 np0005539279 nova_compute[187514]: 2025-11-29 00:58:51.909 187518 DEBUG oslo_concurrency.lockutils [None req-c09cdab2-b126-47f8-883c-923aba71f75e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:52 np0005539279 nova_compute[187514]: 2025-11-29 00:58:52.474 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:53 np0005539279 nova_compute[187514]: 2025-11-29 00:58:53.779 187518 DEBUG nova.compute.manager [req-0a73ac46-568c-46f8-b0dd-7320e8538853 req-256a4e25-504b-482d-99a6-d373271497d1 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Received event network-vif-plugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:58:53 np0005539279 nova_compute[187514]: 2025-11-29 00:58:53.779 187518 DEBUG oslo_concurrency.lockutils [req-0a73ac46-568c-46f8-b0dd-7320e8538853 req-256a4e25-504b-482d-99a6-d373271497d1 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "375250f0-4625-4017-ac44-e74799c55dbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:58:53 np0005539279 nova_compute[187514]: 2025-11-29 00:58:53.780 187518 DEBUG oslo_concurrency.lockutils [req-0a73ac46-568c-46f8-b0dd-7320e8538853 req-256a4e25-504b-482d-99a6-d373271497d1 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:58:53 np0005539279 nova_compute[187514]: 2025-11-29 00:58:53.781 187518 DEBUG oslo_concurrency.lockutils [req-0a73ac46-568c-46f8-b0dd-7320e8538853 req-256a4e25-504b-482d-99a6-d373271497d1 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:58:53 np0005539279 nova_compute[187514]: 2025-11-29 00:58:53.781 187518 DEBUG nova.compute.manager [req-0a73ac46-568c-46f8-b0dd-7320e8538853 req-256a4e25-504b-482d-99a6-d373271497d1 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] No waiting events found dispatching network-vif-plugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 19:58:53 np0005539279 nova_compute[187514]: 2025-11-29 00:58:53.782 187518 WARNING nova.compute.manager [req-0a73ac46-568c-46f8-b0dd-7320e8538853 req-256a4e25-504b-482d-99a6-d373271497d1 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Received unexpected event network-vif-plugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 for instance with vm_state active and task_state None.#033[00m
Nov 28 19:58:54 np0005539279 nova_compute[187514]: 2025-11-29 00:58:54.545 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:55 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:58:55.409 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 19:58:56 np0005539279 podman[217096]: 2025-11-29 00:58:56.863800953 +0000 UTC m=+0.096379660 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 19:58:56 np0005539279 podman[217095]: 2025-11-29 00:58:56.88318216 +0000 UTC m=+0.121209135 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 19:58:57 np0005539279 nova_compute[187514]: 2025-11-29 00:58:57.476 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:59 np0005539279 nova_compute[187514]: 2025-11-29 00:58:59.548 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:58:59 np0005539279 podman[217139]: 2025-11-29 00:58:59.855318135 +0000 UTC m=+0.089760401 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 19:58:59 np0005539279 podman[217138]: 2025-11-29 00:58:59.906969609 +0000 UTC m=+0.139537564 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:59:01 np0005539279 nova_compute[187514]: 2025-11-29 00:59:01.494 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:01 np0005539279 nova_compute[187514]: 2025-11-29 00:59:01.529 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Triggering sync for uuid cdaab479-3862-458b-b200-b443c1647c78 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 28 19:59:01 np0005539279 nova_compute[187514]: 2025-11-29 00:59:01.530 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Triggering sync for uuid 375250f0-4625-4017-ac44-e74799c55dbf _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 28 19:59:01 np0005539279 nova_compute[187514]: 2025-11-29 00:59:01.530 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:59:01 np0005539279 nova_compute[187514]: 2025-11-29 00:59:01.531 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "cdaab479-3862-458b-b200-b443c1647c78" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:59:01 np0005539279 nova_compute[187514]: 2025-11-29 00:59:01.531 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "375250f0-4625-4017-ac44-e74799c55dbf" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:59:01 np0005539279 nova_compute[187514]: 2025-11-29 00:59:01.532 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "375250f0-4625-4017-ac44-e74799c55dbf" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:59:01 np0005539279 nova_compute[187514]: 2025-11-29 00:59:01.583 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "cdaab479-3862-458b-b200-b443c1647c78" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:59:01 np0005539279 nova_compute[187514]: 2025-11-29 00:59:01.585 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "375250f0-4625-4017-ac44-e74799c55dbf" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:59:02 np0005539279 nova_compute[187514]: 2025-11-29 00:59:02.479 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:04 np0005539279 nova_compute[187514]: 2025-11-29 00:59:04.550 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:59:04Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:13:f7:d4 10.100.0.27
Nov 28 19:59:04 np0005539279 ovn_controller[95686]: 2025-11-29T00:59:04Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:13:f7:d4 10.100.0.27
Nov 28 19:59:07 np0005539279 nova_compute[187514]: 2025-11-29 00:59:07.481 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:59:08.094 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:59:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:59:08.095 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:59:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 00:59:08.097 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:59:09 np0005539279 nova_compute[187514]: 2025-11-29 00:59:09.553 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:12 np0005539279 nova_compute[187514]: 2025-11-29 00:59:12.484 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:13 np0005539279 nova_compute[187514]: 2025-11-29 00:59:13.785 187518 DEBUG nova.compute.manager [req-09c39d0b-c0d5-48be-bf40-e6793648d673 req-62a0feef-9e4e-43b0-89e2-ef30c5d914bc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-changed-04107db0-1e00-49d9-8888-dd071f790f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 19:59:13 np0005539279 nova_compute[187514]: 2025-11-29 00:59:13.786 187518 DEBUG nova.compute.manager [req-09c39d0b-c0d5-48be-bf40-e6793648d673 req-62a0feef-9e4e-43b0-89e2-ef30c5d914bc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing instance network info cache due to event network-changed-04107db0-1e00-49d9-8888-dd071f790f24. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 19:59:13 np0005539279 nova_compute[187514]: 2025-11-29 00:59:13.787 187518 DEBUG oslo_concurrency.lockutils [req-09c39d0b-c0d5-48be-bf40-e6793648d673 req-62a0feef-9e4e-43b0-89e2-ef30c5d914bc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:59:13 np0005539279 nova_compute[187514]: 2025-11-29 00:59:13.787 187518 DEBUG oslo_concurrency.lockutils [req-09c39d0b-c0d5-48be-bf40-e6793648d673 req-62a0feef-9e4e-43b0-89e2-ef30c5d914bc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:59:13 np0005539279 nova_compute[187514]: 2025-11-29 00:59:13.788 187518 DEBUG nova.network.neutron [req-09c39d0b-c0d5-48be-bf40-e6793648d673 req-62a0feef-9e4e-43b0-89e2-ef30c5d914bc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing network info cache for port 04107db0-1e00-49d9-8888-dd071f790f24 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 19:59:14 np0005539279 nova_compute[187514]: 2025-11-29 00:59:14.554 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:14 np0005539279 podman[217198]: 2025-11-29 00:59:14.846883027 +0000 UTC m=+0.073182530 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 19:59:14 np0005539279 podman[217197]: 2025-11-29 00:59:14.851073931 +0000 UTC m=+0.081193138 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 19:59:14 np0005539279 podman[217196]: 2025-11-29 00:59:14.877097988 +0000 UTC m=+0.110310619 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 19:59:15 np0005539279 nova_compute[187514]: 2025-11-29 00:59:15.212 187518 DEBUG nova.network.neutron [req-09c39d0b-c0d5-48be-bf40-e6793648d673 req-62a0feef-9e4e-43b0-89e2-ef30c5d914bc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updated VIF entry in instance network info cache for port 04107db0-1e00-49d9-8888-dd071f790f24. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 19:59:15 np0005539279 nova_compute[187514]: 2025-11-29 00:59:15.213 187518 DEBUG nova.network.neutron [req-09c39d0b-c0d5-48be-bf40-e6793648d673 req-62a0feef-9e4e-43b0-89e2-ef30c5d914bc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:59:15 np0005539279 nova_compute[187514]: 2025-11-29 00:59:15.240 187518 DEBUG oslo_concurrency.lockutils [req-09c39d0b-c0d5-48be-bf40-e6793648d673 req-62a0feef-9e4e-43b0-89e2-ef30c5d914bc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:59:17 np0005539279 nova_compute[187514]: 2025-11-29 00:59:17.488 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:19 np0005539279 nova_compute[187514]: 2025-11-29 00:59:19.571 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:20 np0005539279 nova_compute[187514]: 2025-11-29 00:59:20.648 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.490 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.641 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.642 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.643 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.643 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.761 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.856 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.859 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.941 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:59:22 np0005539279 nova_compute[187514]: 2025-11-29 00:59:22.952 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.051 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.053 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.151 187518 DEBUG oslo_concurrency.processutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.375 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.376 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5463MB free_disk=73.28161239624023GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.377 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.377 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.492 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Instance cdaab479-3862-458b-b200-b443c1647c78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.493 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Instance 375250f0-4625-4017-ac44-e74799c55dbf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.493 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.494 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.535 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing inventories for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.572 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating ProviderTree inventory for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.573 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.599 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing aggregate associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.632 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing trait associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, traits: COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,HW_CPU_X86_SSE,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.717 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.794 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.821 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 19:59:23 np0005539279 nova_compute[187514]: 2025-11-29 00:59:23.822 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 19:59:24 np0005539279 nova_compute[187514]: 2025-11-29 00:59:24.575 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:24 np0005539279 nova_compute[187514]: 2025-11-29 00:59:24.822 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:24 np0005539279 nova_compute[187514]: 2025-11-29 00:59:24.823 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 19:59:24 np0005539279 nova_compute[187514]: 2025-11-29 00:59:24.823 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 19:59:25 np0005539279 nova_compute[187514]: 2025-11-29 00:59:25.362 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 19:59:25 np0005539279 nova_compute[187514]: 2025-11-29 00:59:25.363 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquired lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 19:59:25 np0005539279 nova_compute[187514]: 2025-11-29 00:59:25.364 187518 DEBUG nova.network.neutron [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 19:59:25 np0005539279 nova_compute[187514]: 2025-11-29 00:59:25.364 187518 DEBUG nova.objects.instance [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cdaab479-3862-458b-b200-b443c1647c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 19:59:27 np0005539279 nova_compute[187514]: 2025-11-29 00:59:27.493 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:27 np0005539279 nova_compute[187514]: 2025-11-29 00:59:27.840 187518 DEBUG nova.network.neutron [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 19:59:27 np0005539279 podman[217276]: 2025-11-29 00:59:27.871750783 +0000 UTC m=+0.102462126 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 19:59:27 np0005539279 nova_compute[187514]: 2025-11-29 00:59:27.875 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Releasing lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 19:59:27 np0005539279 nova_compute[187514]: 2025-11-29 00:59:27.875 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 19:59:27 np0005539279 nova_compute[187514]: 2025-11-29 00:59:27.876 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:27 np0005539279 nova_compute[187514]: 2025-11-29 00:59:27.876 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:27 np0005539279 nova_compute[187514]: 2025-11-29 00:59:27.877 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 19:59:27 np0005539279 podman[217275]: 2025-11-29 00:59:27.88784301 +0000 UTC m=+0.124117095 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, container_name=openstack_network_exporter, version=9.6, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 19:59:28 np0005539279 nova_compute[187514]: 2025-11-29 00:59:28.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:28 np0005539279 nova_compute[187514]: 2025-11-29 00:59:28.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:28 np0005539279 nova_compute[187514]: 2025-11-29 00:59:28.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:29 np0005539279 nova_compute[187514]: 2025-11-29 00:59:29.606 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:29 np0005539279 nova_compute[187514]: 2025-11-29 00:59:29.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 19:59:30 np0005539279 podman[217323]: 2025-11-29 00:59:30.847557437 +0000 UTC m=+0.072139742 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 19:59:30 np0005539279 podman[217322]: 2025-11-29 00:59:30.873709418 +0000 UTC m=+0.116301782 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.312 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cdaab479-3862-458b-b200-b443c1647c78', 'name': 'tempest-TestNetworkBasicOps-server-2097217943', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0df0de37c7d74836a2135b0d6ff3a067', 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'hostId': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.316 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '375250f0-4625-4017-ac44-e74799c55dbf', 'name': 'tempest-TestNetworkBasicOps-server-789600378', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0df0de37c7d74836a2135b0d6ff3a067', 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'hostId': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.366 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.write.bytes volume: 73269248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.367 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.416 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.write.bytes volume: 72941568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.416 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9302f992-b3df-45b3-827a-d0c8c736e3dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73269248, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-vda', 'timestamp': '2025-11-29T00:59:32.316967', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa6de930-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': 'c9e3caad93a11f56f8d1be759285a04c9ab26fe2b3920b6a4d87ca534ae58bfb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 
'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-sda', 'timestamp': '2025-11-29T00:59:32.316967', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa6dfe70-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': '2144c45b1f696b98f399041c5c385e1e820841859bf8801b1d48f55f4615cace'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72941568, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-vda', 'timestamp': '2025-11-29T00:59:32.316967', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa758096-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '461889ef5b820773dd23bec83ef8ed8ac8d5e222aa3b614eeae474b11404a76f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-sda', 'timestamp': '2025-11-29T00:59:32.316967', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa759824-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '0928362b7b01a2b8ce1b972bb3d7a22936d95aba162f111431c441c1a92b13c5'}]}, 'timestamp': '2025-11-29 00:59:32.417367', '_unique_id': '391da2638e8f4057937896de2d09ec3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.419 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.420 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.448 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/cpu volume: 11560000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.476 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/cpu volume: 10850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5938d4ab-bdca-4ff9-8b79-ffb75262fd1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11560000000, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'timestamp': '2025-11-29T00:59:32.420726', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'aa7a76e6-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.207125275, 'message_signature': '063a5e97fa1f89e99a58b29a8b2f2289a4c850aa758ef16d35c42e936849cabf'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10850000000, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 
'375250f0-4625-4017-ac44-e74799c55dbf', 'timestamp': '2025-11-29T00:59:32.420726', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'aa7eacd4-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.234780707, 'message_signature': '843ed03f23648a47a6a69676c44652ca9f16d6d8f127d54085ce8d319214a57f'}]}, 'timestamp': '2025-11-29 00:59:32.476901', '_unique_id': 'da2803708e134af7977f3b67440056a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.478 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.479 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.483 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cdaab479-3862-458b-b200-b443c1647c78 / tap619a3ed2-fa inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.484 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cdaab479-3862-458b-b200-b443c1647c78 / tap04107db0-1e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.484 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.485 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.488 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 375250f0-4625-4017-ac44-e74799c55dbf / tap4ce4680e-f5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.488 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a8c3034-3089-408c-a22f-34e6a62017aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.480131', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa7ff166-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '456492c649a54688300a13b82a0e06775406d3c5ed0104d2b988b6ad7407050a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.480131', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa800552-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '4f47626aee92c873bf1e3419dd73d89c91a07b299daafa3a19c303ef7433299d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.480131', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': 
'5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa8096c0-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': '01199055e6ee11d4ffc531a0ae5e47a1a7f5a0d59617f820eae4e57ce48d9918'}]}, 'timestamp': '2025-11-29 00:59:32.489447', '_unique_id': 'bd79f028b6474a7ab79034676baf8fca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.490 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.492 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.492 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.492 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2097217943>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-789600378>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2097217943>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-789600378>]
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.492 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 19:59:32 np0005539279 nova_compute[187514]: 2025-11-29 00:59:32.496 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.511 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.usage volume: 30146560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.511 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.530 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.530 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0bf06989-42b9-4631-8412-63626dd4ef0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30146560, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-vda', 'timestamp': '2025-11-29T00:59:32.492888', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa83f626-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.251488081, 'message_signature': '391b1f4eee1c919d41fe31bb1daf6e567137b618a4be89fe30722054530aec69'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 
'cdaab479-3862-458b-b200-b443c1647c78-sda', 'timestamp': '2025-11-29T00:59:32.492888', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa840670-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.251488081, 'message_signature': 'ef0c561d2b9475b1cb5928df8966260db97bbf82580e6d9327edd9415f5d99fe'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-vda', 'timestamp': '2025-11-29T00:59:32.492888', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa86eb88-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.270403655, 'message_signature': '60d803e6971a6eab92fc63e5c65eae7799d904497fef862f6ac00a4c2736158b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-sda', 'timestamp': '2025-11-29T00:59:32.492888', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa87003c-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.270403655, 'message_signature': 'e6f44b74d4dae46f3ca48ec620f4d47362237d0504379f372c3d27712ce33a87'}]}, 'timestamp': '2025-11-29 00:59:32.531440', '_unique_id': '6f2374560adf41589780691163a12893'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.533 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.534 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.534 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.write.latency volume: 1776996651 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.535 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.535 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.write.latency volume: 2707195856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.536 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cccf8816-4f75-4d15-b5da-ca1f9f6c58e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1776996651, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-vda', 'timestamp': '2025-11-29T00:59:32.534746', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa87960a-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': '5e50103b2c09450568b4f899594860a06a3872a0173f100deb0b07a7a022cbb2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 
'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-sda', 'timestamp': '2025-11-29T00:59:32.534746', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa87a8de-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': 'b3008a82b1347868b69c6400b9c6724ba228e807e207e6e5e48abf4102e6692c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2707195856, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-vda', 'timestamp': '2025-11-29T00:59:32.534746', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa87bd56-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': 'c8a0f8696a44773075444adb48951094bf02e61341724e7a58d76bf989eadeb2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-sda', 'timestamp': '2025-11-29T00:59:32.534746', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa87cf8a-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '79cc388616036953bec90068f0d8ba971774fdd58c16ba132fddb612f6845046'}]}, 'timestamp': '2025-11-29 00:59:32.536724', '_unique_id': 'bea589342a5e401a9adb2a3ad51f3eec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.537 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.539 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.539 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.packets volume: 482 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.539 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.540 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.incoming.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6116a023-9d2f-4cb6-920d-82551307922c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 482, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.539346', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa8849a6-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '78ec2918a49b991e9136f57837c379a0fac684e561afaf83b52e83292a1e3e33'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 26, 
'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.539346', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa885d42-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '7c0b65db7368c5a6069cce748cbbc453fda1dcb81651a5b189e6396f57522433'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.539346', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': 
'5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa887048-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': 'b81c1cfab767a4c46b63f16e840bf3aec4e6e8cc61eab05408c5a0374242bfd9'}]}, 'timestamp': '2025-11-29 00:59:32.540899', '_unique_id': 'a5713d5e87a445e695b0983e05553a95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.542 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.543 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.543 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.544 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.544 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee259881-5c32-4dbe-aa4d-ed84b6408fcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.543400', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa88e906-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': 'e08940c93afab4b11a63e37790fb56bdca95104c7f096e724c4e9ca6c8497d18'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.543400', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa88fe28-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '762bdf3de6d6a6af97159533c010c4193d3af57e589e026e79845c23c100b241'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.543400', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': 
'5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa891156-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': 'cbe06b5ee3fd2120767a26e188369462f9c9edac7921d942192f70ef395d26cc'}]}, 'timestamp': '2025-11-29 00:59:32.544975', '_unique_id': '56abed3d2a1e47e89683816654ec18c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.546 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.547 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.547 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.bytes volume: 82076 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.548 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.bytes volume: 3872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.548 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.outgoing.bytes volume: 2740 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65ed904a-68c8-49b6-91c6-2a82bd7ecba6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 82076, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.547586', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa898bae-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': 'ff4c6835f26921e591e4545c912d828024ba79e20230660de603026a8545a7cf'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3872, 'user_id': 
'1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.547586', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa899ffe-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '662f98d6a277e28b8bee39098abe97f1afbaea71fa8be4099d013569d700f899'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2740, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.547586', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': 
'5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa89b2e6-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': 'd45f0f15476a661c8e48097b06283faea87131e57b69eac2af5be119c1b3c328'}]}, 'timestamp': '2025-11-29 00:59:32.549106', '_unique_id': 'bb5a5ffc7cff47b8be0f547197436804'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.550 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.551 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.551 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.552 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.552 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ba1309a-c6b5-4d74-a7ee-da5a485ed499', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.551647', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa8a2a6e-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': 'a02f77f414c46dcf3719ac99892fe0ca802fbead0efefde00aefa9b3daa897a8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.551647', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa8a3c5c-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '3b26953a99ee520e8103a37084d357c29fe61d5479f93e548b48b194f38fcf49'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.551647', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': 
'5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa8a4f6c-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': '06dbaa6130bbe15bccb69acafa43ee0197fb01ad131c19fb13be4d79b732ed45'}]}, 'timestamp': '2025-11-29 00:59:32.553118', '_unique_id': '7afb4ca4d5514f56ba1500c26ae89899'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.554 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.555 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.555 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/memory.usage volume: 42.7109375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.556 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/memory.usage volume: 42.80078125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3d38605-d4e8-461c-9a16-bd1dbaa4b582', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7109375, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'timestamp': '2025-11-29T00:59:32.555637', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'aa8ac5be-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.207125275, 'message_signature': '192dec987edf40a4a8bb0a50e3542e9237dec166c3f28a7a5ffaea3f307d1168'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.80078125, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 
'375250f0-4625-4017-ac44-e74799c55dbf', 'timestamp': '2025-11-29T00:59:32.555637', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'aa8ad6d0-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.234780707, 'message_signature': '47dc4806e4d549e79d60909aacef5b4ebdb0a3095d373f43b59e90eb0d54bf2c'}]}, 'timestamp': '2025-11-29 00:59:32.556608', '_unique_id': '1f5e098892a340e9af11820b7c771520'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.557 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.558 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.559 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.559 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.560 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.560 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70e53f77-b7ee-4d14-a3b3-bff13b0bc542', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-vda', 'timestamp': '2025-11-29T00:59:32.558958', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8b45a2-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.251488081, 'message_signature': 'df5966c137d24800a1b78a08b918fe5dd94cb2ef1c2a402d33432216844f9ce6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 
'cdaab479-3862-458b-b200-b443c1647c78-sda', 'timestamp': '2025-11-29T00:59:32.558958', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa8b5c90-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.251488081, 'message_signature': 'c522c5971b8a40614bf73a2dcf4c8204709ba3fa683dca2234c9e6fb6e7dbbde'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-vda', 'timestamp': '2025-11-29T00:59:32.558958', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8b7054-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.270403655, 'message_signature': 'c574731cf3eadc2de68841a1931853e3cabba6a618632264bf33704586323250'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-sda', 'timestamp': '2025-11-29T00:59:32.558958', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa8b8288-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.270403655, 'message_signature': '13e0ead06ccd021f63d6101b87ac827961ea57bab7ab7e946ccbff4660f134d9'}]}, 'timestamp': '2025-11-29 00:59:32.560962', '_unique_id': '863ad6424b694b3d918aa405efd23f1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.562 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.563 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.563 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.564 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2097217943>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-789600378>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2097217943>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-789600378>]
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.564 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.564 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.read.latency volume: 279150014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.564 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.read.latency volume: 33862670 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.565 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.read.latency volume: 254642415 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.565 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.read.latency volume: 27283660 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '446e4f22-53f7-460e-a1fb-0b44c82ddb5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 279150014, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-vda', 'timestamp': '2025-11-29T00:59:32.564513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8c1f90-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': '640ba52310885ec2700490acb99f61a5f6ac113db8b0c4801f83d32eda6d1b50'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33862670, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': 
None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-sda', 'timestamp': '2025-11-29T00:59:32.564513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa8c3098-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': '6a44381fcd35022a4ce3a4c442a5497391e8dd18ed2f065d45b97217fdf71bb3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 254642415, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-vda', 'timestamp': '2025-11-29T00:59:32.564513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8c4286-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '8ab5628439e600393a9b90e33fdd4dc2663b37df892c393f4dba497c9847da0a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27283660, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-sda', 'timestamp': '2025-11-29T00:59:32.564513', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa8c5550-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '8fa683ca9056d2670bdf015e77ba6467d85ed05463b26cf4f1fa75873e85fa19'}]}, 'timestamp': '2025-11-29 00:59:32.566360', '_unique_id': 'a9bd61794935432d91842eec39b891c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.567 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.568 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.569 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.read.bytes volume: 30460416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.569 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.570 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.read.bytes volume: 31300096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.570 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cbccc2a-eae0-4b72-b7d2-804c9f993a18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30460416, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-vda', 'timestamp': '2025-11-29T00:59:32.569088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8cd1ec-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': 'c23fe420bc86931136f022f64a827ff980e520d83d5682da95ca3a6e3a2c84f4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 
'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-sda', 'timestamp': '2025-11-29T00:59:32.569088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa8ce736-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': '620af093982549ccd1a5fec23e57792a2856a98ec88452a99470c0bd654d6603'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31300096, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-vda', 'timestamp': '2025-11-29T00:59:32.569088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8cf7e4-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '9dbd5c665a69255ec33f2feac718fa9e48a8a24b0cd2a3950dedef4a6f6afe67'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-sda', 'timestamp': '2025-11-29T00:59:32.569088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa8d0c52-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': 'd85bf7cb780f6e7065feaa103546a2f31a07b692f901750ff161f2a43166be41'}]}, 'timestamp': '2025-11-29 00:59:32.571049', '_unique_id': '331a5b6958d14825b99ce6c5978d3547'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.572 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.573 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.573 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.packets volume: 538 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.573 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.packets volume: 40 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.573 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66c94c2d-f94a-460d-b38c-eec54abcd1ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 538, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.573338', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa8d7502-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': 'bac4fddfe396b0418e44c3712e903b4948e261c649151c6667ba7834f4ac25f5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 40, 
'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.573338', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa8d8024-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '3a2eb3cf5b558acd49485b7c368caf5bad1d497e7d774c061a65fd1b48cba919'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.573338', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': 
'5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa8d8a92-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': '36fd0419978da3a43f957d99f627b69a0ab7f6a424f2dc0b7c55e4efb499c270'}]}, 'timestamp': '2025-11-29 00:59:32.574205', '_unique_id': '6e5fc77c75a14aa8bfe9c23dbfd998e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.574 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.575 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.575 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.read.requests volume: 1104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.576 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.576 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.read.requests volume: 1125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.576 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ec36b3d-fc54-4ce2-a553-3f7e0a0cf47e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1104, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-vda', 'timestamp': '2025-11-29T00:59:32.575704', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8dd07e-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': '8d64a2f940aca2ce013aa207dfc98dc8ea7db06d1bb99b50b62e5aacab3ff330'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': 
None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-sda', 'timestamp': '2025-11-29T00:59:32.575704', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa8ddbf0-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': '5bc2f04587885b9be3bff06fbbc6e9d09e6bfc1a52cadd5c17d71d906034a9d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1125, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-vda', 'timestamp': '2025-11-29T00:59:32.575704', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8de60e-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '41052d0d366ada55692ed6a3db7f729d19c45e3a035069e735aba16f98b6040f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-sda', 'timestamp': '2025-11-29T00:59:32.575704', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa8df1a8-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '8941dc75a850fe00a58976cff7b002fa4930a616fe75dda8419511b452bfeffa'}]}, 'timestamp': '2025-11-29 00:59:32.576836', '_unique_id': 'ce7dc3b1082548ab8a62f18d54480adc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.577 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.578 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.578 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.578 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.579 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f96fa56c-0490-42ed-b316-8979b391a4d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.578553', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa8e4018-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '2779afa7ad691935d867340ba36eefff4ce6263f00667a9e1a76310cbee83cbc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.578553', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa8e4b30-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': 'b2c4ab5c970dc3a708b0a7e91d92cec8067106d1a6fc93977447590580990fad'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.578553', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa8e5594-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': '72537cf4969b6bacd5d4c5779d8f047d947f55560fe840c9ad036c6779651ba2'}]}, 'timestamp': '2025-11-29 00:59:32.579465', '_unique_id': '4a4f7fad9c8c4106a2a258f395038f35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.580 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.581 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.581 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.581 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.581 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3dc065ee-0d1f-48bb-b2fd-7fc98dd461d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.581187', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa8ea6ac-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': 'fd65cd90ade348e0f562a313132ab25e067aafe4f8f4aa549968f4074941f50e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.581187', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa8eb2a0-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': 'c33d02b72dade42e9846db1c9028aaa2daaf5f0a98d0cbbe90cfd48fd7db99c1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.581187', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa8ebcdc-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': '8dc2af2288918c93ae19417415ab6303bfeaa678ba646031afaffee9b1182d5a'}]}, 'timestamp': '2025-11-29 00:59:32.582047', '_unique_id': 'aa4c0e1608064909b7667089f19e0dd0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.582 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.583 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.583 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.584 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.584 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17668957-7472-4298-a66d-ef1e70a6f1f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.583713', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa8f096c-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '8f033da4e702ef070cb55dadd012b5e7116d216883394084d0723ad8eb57cd72'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.583713', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa8f157e-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '3cd3e199b51b5b52f61301f3574f3323d33383efc2ad953f828b9142868c90a0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.583713', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': 
'5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa8f2096-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': '592f256273a4ec8f3357c5290f9b8159acb86bc22abd61a980d4cb9be43cb7d8'}]}, 'timestamp': '2025-11-29 00:59:32.584653', '_unique_id': '16894a6286c24301bb60166027fc1549'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.585 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.586 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.586 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.586 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2097217943>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-789600378>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2097217943>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-789600378>]
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.586 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.586 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.bytes volume: 93288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.586 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/network.incoming.bytes volume: 2660 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.587 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/network.incoming.bytes volume: 3130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f10ca71-6581-426d-a286-59abedcfff79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 93288, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap619a3ed2-fa', 'timestamp': '2025-11-29T00:59:32.586597', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap619a3ed2-fa', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:68:b1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap619a3ed2-fa'}, 'message_id': 'aa8f7a32-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': '89efeaedf62f218c4c8eb4926d943bf18168155546ad448640896effddb9cd2d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2660, 'user_id': 
'1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000006-cdaab479-3862-458b-b200-b443c1647c78-tap04107db0-1e', 'timestamp': '2025-11-29T00:59:32.586597', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'tap04107db0-1e', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:9f:7c:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap04107db0-1e'}, 'message_id': 'aa8f870c-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.238737185, 'message_signature': 'b018ddf27cca6db06d8707e7fb3d051f23b07dcd8f4444ed7b2263663e7dbe0f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3130, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'instance-00000007-375250f0-4625-4017-ac44-e74799c55dbf-tap4ce4680e-f5', 'timestamp': '2025-11-29T00:59:32.586597', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'tap4ce4680e-f5', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': 
'5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:13:f7:d4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4ce4680e-f5'}, 'message_id': 'aa8f91e8-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.244298246, 'message_signature': 'bb6e558755df9ac5f0717b1cc8cb53b887f3a25533449c40e80abfcdb45e2a4c'}]}, 'timestamp': '2025-11-29 00:59:32.587528', '_unique_id': 'd17865631552450a95baf2c91ccb4aa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.588 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.589 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.589 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2097217943>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-789600378>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-2097217943>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-789600378>]
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.589 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.589 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.589 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.590 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.590 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4193d2f1-7e66-4fc1-8e6d-62ad7ae7a08e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-vda', 'timestamp': '2025-11-29T00:59:32.589385', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8fe79c-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.251488081, 'message_signature': '67a40f2b987bb3c0c2b10cf6864d7e55a81b6479444913bb445b1e1a663af8ab'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 
'cdaab479-3862-458b-b200-b443c1647c78-sda', 'timestamp': '2025-11-29T00:59:32.589385', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa8ff3b8-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.251488081, 'message_signature': 'c1b42939bfe036a02485246dd4dc75ceac6e3d5ffd3ecfb5ee0f9ac29b7dc5c0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-vda', 'timestamp': '2025-11-29T00:59:32.589385', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa8ffe26-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.270403655, 'message_signature': '401c04c21473f911e785307c40246c0f8f1c953e1578111a4dbc652ec8f1fa4d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-sda', 'timestamp': '2025-11-29T00:59:32.589385', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa900830-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.270403655, 'message_signature': '0b89d123e978466662d00c02bd9c6125554d7c9a0ef2d1954bbc580c7f9cebe3'}]}, 'timestamp': '2025-11-29 00:59:32.590546', '_unique_id': '2c8b78c327ad4f1da7ecf4e2f76ec764'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.591 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.592 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.592 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.write.requests volume: 349 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.592 12 DEBUG ceilometer.compute.pollsters [-] cdaab479-3862-458b-b200-b443c1647c78/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.592 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.write.requests volume: 313 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.593 12 DEBUG ceilometer.compute.pollsters [-] 375250f0-4625-4017-ac44-e74799c55dbf/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '415bd37a-7b29-45be-9020-cc64ade038ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 349, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-vda', 'timestamp': '2025-11-29T00:59:32.592298', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa9058a8-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': '887fe75aef422fb4407173aa2e6e819b60a1f3620f7e6735d4bf1dd02c767657'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': 
None, 'resource_id': 'cdaab479-3862-458b-b200-b443c1647c78-sda', 'timestamp': '2025-11-29T00:59:32.592298', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-2097217943', 'name': 'instance-00000006', 'instance_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa906488-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.0755923, 'message_signature': '3132dc94bebde1ffe528eb430d3c9cf676487d83a59cfdde2bfc8dc28ff9f697'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 313, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-vda', 'timestamp': '2025-11-29T00:59:32.592298', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'aa906e88-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '610c51cc1fd275be827869c6ab5d8d50dd330d2677dd0d6118b0c86649ed8615'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_name': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_name': None, 'resource_id': '375250f0-4625-4017-ac44-e74799c55dbf-sda', 'timestamp': '2025-11-29T00:59:32.592298', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-789600378', 'name': 'instance-00000007', 'instance_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'instance_type': 'm1.nano', 'host': '5c34750f3c699e18c7bcd4785759257012cad80aabf625985f2241d8', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6ce17e5f-9ac5-497d-adc9-1357453b4367', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '017f04d5-006e-46df-a06f-ac852f70dddf'}, 'image_ref': '017f04d5-006e-46df-a06f-ac852f70dddf', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'aa907a36-ccbe-11f0-af57-fa163e74b97a', 'monotonic_time': 3906.126178105, 'message_signature': '783c3598608a4009fd9fb2ef50ae8792285e5eb77687fe5305e79af99d8530eb'}]}, 'timestamp': '2025-11-29 00:59:32.593443', '_unique_id': '94b248960d41401b8f565afad226120d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 19:59:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 00:59:32.594 12 ERROR oslo_messaging.notify.messaging 
Nov 28 19:59:34 np0005539279 nova_compute[187514]: 2025-11-29 00:59:34.608 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:37 np0005539279 nova_compute[187514]: 2025-11-29 00:59:37.498 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:39 np0005539279 nova_compute[187514]: 2025-11-29 00:59:39.611 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:42 np0005539279 nova_compute[187514]: 2025-11-29 00:59:42.501 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:43 np0005539279 ovn_controller[95686]: 2025-11-29T00:59:43Z|00107|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Nov 28 19:59:44 np0005539279 nova_compute[187514]: 2025-11-29 00:59:44.613 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:45 np0005539279 podman[217375]: 2025-11-29 00:59:45.85236086 +0000 UTC m=+0.088428905 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 19:59:45 np0005539279 podman[217374]: 2025-11-29 00:59:45.862790683 +0000 UTC m=+0.103393431 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 19:59:45 np0005539279 podman[217376]: 2025-11-29 00:59:45.866871264 +0000 UTC m=+0.096173865 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 19:59:47 np0005539279 nova_compute[187514]: 2025-11-29 00:59:47.503 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:49 np0005539279 nova_compute[187514]: 2025-11-29 00:59:49.652 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:52 np0005539279 nova_compute[187514]: 2025-11-29 00:59:52.507 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:54 np0005539279 nova_compute[187514]: 2025-11-29 00:59:54.655 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:57 np0005539279 nova_compute[187514]: 2025-11-29 00:59:57.510 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 19:59:58 np0005539279 podman[217436]: 2025-11-29 00:59:58.844116777 +0000 UTC m=+0.077166899 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 19:59:58 np0005539279 podman[217435]: 2025-11-29 00:59:58.875833539 +0000 UTC m=+0.112868639 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Nov 28 19:59:59 np0005539279 nova_compute[187514]: 2025-11-29 00:59:59.660 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.560 187518 DEBUG oslo_concurrency.lockutils [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "375250f0-4625-4017-ac44-e74799c55dbf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.561 187518 DEBUG oslo_concurrency.lockutils [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.561 187518 DEBUG oslo_concurrency.lockutils [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "375250f0-4625-4017-ac44-e74799c55dbf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.562 187518 DEBUG oslo_concurrency.lockutils [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.562 187518 DEBUG oslo_concurrency.lockutils [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.565 187518 INFO nova.compute.manager [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Terminating instance#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.567 187518 DEBUG nova.compute.manager [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 20:00:00 np0005539279 kernel: tap4ce4680e-f5 (unregistering): left promiscuous mode
Nov 28 20:00:00 np0005539279 NetworkManager[55703]: <info>  [1764378000.5972] device (tap4ce4680e-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.616 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:00 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:00Z|00108|binding|INFO|Releasing lport 4ce4680e-f578-4ef3-8110-b81c6011ca78 from this chassis (sb_readonly=0)
Nov 28 20:00:00 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:00Z|00109|binding|INFO|Setting lport 4ce4680e-f578-4ef3-8110-b81c6011ca78 down in Southbound
Nov 28 20:00:00 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:00Z|00110|binding|INFO|Removing iface tap4ce4680e-f5 ovn-installed in OVS
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.619 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.636 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.640 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:13:f7:d4 10.100.0.27'], port_security=['fa:16:3e:13:f7:d4 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '375250f0-4625-4017-ac44-e74799c55dbf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beb28e65-81a9-4c61-962b-bcd4d536483d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ef49e330-de8e-4b91-bb66-5d9b2f8c3106', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d853b3f-840e-4869-bcf7-a67e0ea8364c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=4ce4680e-f578-4ef3-8110-b81c6011ca78) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.642 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 4ce4680e-f578-4ef3-8110-b81c6011ca78 in datapath beb28e65-81a9-4c61-962b-bcd4d536483d unbound from our chassis#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.643 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network beb28e65-81a9-4c61-962b-bcd4d536483d#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.657 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[d6a673ba-a74e-4a3a-aabf-7369fa531082]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:00 np0005539279 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 28 20:00:00 np0005539279 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 14.965s CPU time.
Nov 28 20:00:00 np0005539279 systemd-machined[153752]: Machine qemu-7-instance-00000007 terminated.
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.696 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[b454e228-4249-40b3-9ef9-e904ed3a84e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.699 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[87150d0a-2471-4767-91f6-84da0c1fe3cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.736 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7d2419-deb7-4900-915a-d4700d8f59cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.761 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[5eab9d6a-9e30-4abe-9a00-feb5a0698ff9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbeb28e65-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:4c:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 8, 'rx_bytes': 742, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 8, 'rx_bytes': 742, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384666, 'reachable_time': 21671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217493, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.780 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c69c2760-dc56-4e89-809a-b842a4d56314]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbeb28e65-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384683, 'tstamp': 384683}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217494, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapbeb28e65-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384687, 'tstamp': 384687}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217494, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.782 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbeb28e65-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.783 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.789 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.789 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbeb28e65-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.789 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.790 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbeb28e65-80, col_values=(('external_ids', {'iface-id': '8657db2e-0ad1-471b-be5c-ea510f417caf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:00.790 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.838 187518 INFO nova.virt.libvirt.driver [-] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Instance destroyed successfully.#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.838 187518 DEBUG nova.objects.instance [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid 375250f0-4625-4017-ac44-e74799c55dbf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.879 187518 DEBUG nova.virt.libvirt.vif [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:58:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-789600378',display_name='tempest-TestNetworkBasicOps-server-789600378',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-789600378',id=7,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKqxNWaE/K8iXOUK4SxlB5irt2bYuR4n+JtaAN+FnQAgYJ2yndSRSNkEbwzuHQdRKn+8qNjOtQlC+1yxvPDcocUJM6LWl4jABxMpf5CHJW7UphTphp1DuQz5nJcotlpFOg==',key_name='tempest-TestNetworkBasicOps-2086077250',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:58:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-dm6z0msm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:58:51Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=375250f0-4625-4017-ac44-e74799c55dbf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.880 187518 DEBUG nova.network.os_vif_util [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "address": "fa:16:3e:13:f7:d4", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ce4680e-f5", "ovs_interfaceid": "4ce4680e-f578-4ef3-8110-b81c6011ca78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.881 187518 DEBUG nova.network.os_vif_util [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:13:f7:d4,bridge_name='br-int',has_traffic_filtering=True,id=4ce4680e-f578-4ef3-8110-b81c6011ca78,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ce4680e-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.881 187518 DEBUG os_vif [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:f7:d4,bridge_name='br-int',has_traffic_filtering=True,id=4ce4680e-f578-4ef3-8110-b81c6011ca78,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ce4680e-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.883 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.884 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ce4680e-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.885 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.888 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.890 187518 INFO os_vif [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:13:f7:d4,bridge_name='br-int',has_traffic_filtering=True,id=4ce4680e-f578-4ef3-8110-b81c6011ca78,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ce4680e-f5')#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.891 187518 INFO nova.virt.libvirt.driver [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Deleting instance files /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf_del#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.892 187518 INFO nova.virt.libvirt.driver [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Deletion of /var/lib/nova/instances/375250f0-4625-4017-ac44-e74799c55dbf_del complete#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.973 187518 INFO nova.compute.manager [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.973 187518 DEBUG oslo.service.loopingcall [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.974 187518 DEBUG nova.compute.manager [-] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 20:00:00 np0005539279 nova_compute[187514]: 2025-11-29 01:00:00.974 187518 DEBUG nova.network.neutron [-] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 20:00:01 np0005539279 nova_compute[187514]: 2025-11-29 01:00:01.645 187518 DEBUG nova.compute.manager [req-67200a54-2092-4ba2-84bf-6fb2c7333dc6 req-5d274128-4f3a-4c79-895d-9730e232eeaf 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Received event network-vif-unplugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:01 np0005539279 nova_compute[187514]: 2025-11-29 01:00:01.646 187518 DEBUG oslo_concurrency.lockutils [req-67200a54-2092-4ba2-84bf-6fb2c7333dc6 req-5d274128-4f3a-4c79-895d-9730e232eeaf 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "375250f0-4625-4017-ac44-e74799c55dbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:01 np0005539279 nova_compute[187514]: 2025-11-29 01:00:01.647 187518 DEBUG oslo_concurrency.lockutils [req-67200a54-2092-4ba2-84bf-6fb2c7333dc6 req-5d274128-4f3a-4c79-895d-9730e232eeaf 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:01 np0005539279 nova_compute[187514]: 2025-11-29 01:00:01.647 187518 DEBUG oslo_concurrency.lockutils [req-67200a54-2092-4ba2-84bf-6fb2c7333dc6 req-5d274128-4f3a-4c79-895d-9730e232eeaf 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:01 np0005539279 nova_compute[187514]: 2025-11-29 01:00:01.648 187518 DEBUG nova.compute.manager [req-67200a54-2092-4ba2-84bf-6fb2c7333dc6 req-5d274128-4f3a-4c79-895d-9730e232eeaf 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] No waiting events found dispatching network-vif-unplugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:01 np0005539279 nova_compute[187514]: 2025-11-29 01:00:01.648 187518 DEBUG nova.compute.manager [req-67200a54-2092-4ba2-84bf-6fb2c7333dc6 req-5d274128-4f3a-4c79-895d-9730e232eeaf 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Received event network-vif-unplugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 20:00:01 np0005539279 podman[217513]: 2025-11-29 01:00:01.84600798 +0000 UTC m=+0.082638197 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:00:01 np0005539279 podman[217512]: 2025-11-29 01:00:01.867248797 +0000 UTC m=+0.115469589 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller)
Nov 28 20:00:02 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:02.407 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.408 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:02 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:02.410 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.425 187518 DEBUG nova.network.neutron [-] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.442 187518 INFO nova.compute.manager [-] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Took 1.47 seconds to deallocate network for instance.#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.484 187518 DEBUG oslo_concurrency.lockutils [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.485 187518 DEBUG oslo_concurrency.lockutils [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.498 187518 DEBUG nova.compute.manager [req-3a238a2d-7d7f-45ae-81a0-bdf8c563f006 req-c693a1be-05e6-41b1-94f3-f83e0ca27b8b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Received event network-vif-deleted-4ce4680e-f578-4ef3-8110-b81c6011ca78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.546 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.577 187518 DEBUG nova.compute.provider_tree [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.595 187518 DEBUG nova.scheduler.client.report [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.623 187518 DEBUG oslo_concurrency.lockutils [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.673 187518 INFO nova.scheduler.client.report [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance 375250f0-4625-4017-ac44-e74799c55dbf#033[00m
Nov 28 20:00:02 np0005539279 nova_compute[187514]: 2025-11-29 01:00:02.813 187518 DEBUG oslo_concurrency.lockutils [None req-80a7112d-286d-464d-9c34-fdabb7a47a87 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:03 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:03.413 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.746 187518 DEBUG nova.compute.manager [req-8dbaa6fd-a734-48c4-80a4-41a4f0270ed3 req-7042f4d2-f1fa-46a3-8298-6e8356314c69 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Received event network-vif-plugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.746 187518 DEBUG oslo_concurrency.lockutils [req-8dbaa6fd-a734-48c4-80a4-41a4f0270ed3 req-7042f4d2-f1fa-46a3-8298-6e8356314c69 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "375250f0-4625-4017-ac44-e74799c55dbf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.747 187518 DEBUG oslo_concurrency.lockutils [req-8dbaa6fd-a734-48c4-80a4-41a4f0270ed3 req-7042f4d2-f1fa-46a3-8298-6e8356314c69 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.747 187518 DEBUG oslo_concurrency.lockutils [req-8dbaa6fd-a734-48c4-80a4-41a4f0270ed3 req-7042f4d2-f1fa-46a3-8298-6e8356314c69 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "375250f0-4625-4017-ac44-e74799c55dbf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.747 187518 DEBUG nova.compute.manager [req-8dbaa6fd-a734-48c4-80a4-41a4f0270ed3 req-7042f4d2-f1fa-46a3-8298-6e8356314c69 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] No waiting events found dispatching network-vif-plugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.747 187518 WARNING nova.compute.manager [req-8dbaa6fd-a734-48c4-80a4-41a4f0270ed3 req-7042f4d2-f1fa-46a3-8298-6e8356314c69 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Received unexpected event network-vif-plugged-4ce4680e-f578-4ef3-8110-b81c6011ca78 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.779 187518 DEBUG oslo_concurrency.lockutils [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "interface-cdaab479-3862-458b-b200-b443c1647c78-04107db0-1e00-49d9-8888-dd071f790f24" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.780 187518 DEBUG oslo_concurrency.lockutils [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "interface-cdaab479-3862-458b-b200-b443c1647c78-04107db0-1e00-49d9-8888-dd071f790f24" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.805 187518 DEBUG nova.objects.instance [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'flavor' on Instance uuid cdaab479-3862-458b-b200-b443c1647c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.836 187518 DEBUG nova.virt.libvirt.vif [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2097217943',display_name='tempest-TestNetworkBasicOps-server-2097217943',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2097217943',id=6,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF910c7AbYs9dNjjBv3LjPc0J2B8QXYVJQTmIInU8lruARIboYDohKwEgeUYIOY0BzJik1EkH3h93U5lAz+8MC4WBOvGiE7MsVZAGGrudukAJjuE3vx7N1YN0Do/RHWTHw==',key_name='tempest-TestNetworkBasicOps-2147300220',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:58:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-glhr20uf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:58:04Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=cdaab479-3862-458b-b200-b443c1647c78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.837 187518 DEBUG nova.network.os_vif_util [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.838 187518 DEBUG nova.network.os_vif_util [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.844 187518 DEBUG nova.virt.libvirt.guest [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.848 187518 DEBUG nova.virt.libvirt.guest [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.852 187518 DEBUG nova.virt.libvirt.driver [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Attempting to detach device tap04107db0-1e from instance cdaab479-3862-458b-b200-b443c1647c78 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.853 187518 DEBUG nova.virt.libvirt.guest [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] detach device xml: <interface type="ethernet">
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <mac address="fa:16:3e:9f:7c:e0"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <model type="virtio"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <driver name="vhost" rx_queue_size="512"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <mtu size="1442"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <target dev="tap04107db0-1e"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]: </interface>
Nov 28 20:00:03 np0005539279 nova_compute[187514]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.869 187518 DEBUG nova.virt.libvirt.guest [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.874 187518 DEBUG nova.virt.libvirt.guest [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <name>instance-00000006</name>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <uuid>cdaab479-3862-458b-b200-b443c1647c78</uuid>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <nova:name>tempest-TestNetworkBasicOps-server-2097217943</nova:name>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <nova:creationTime>2025-11-29 00:58:32</nova:creationTime>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <nova:flavor name="m1.nano">
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:memory>128</nova:memory>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:disk>1</nova:disk>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:swap>0</nova:swap>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:vcpus>1</nova:vcpus>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </nova:flavor>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <nova:owner>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </nova:owner>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <nova:ports>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:port uuid="619a3ed2-fa55-4d60-8e37-9fd4ff488e12">
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <nova:port uuid="04107db0-1e00-49d9-8888-dd071f790f24">
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </nova:ports>
Nov 28 20:00:03 np0005539279 nova_compute[187514]: </nova:instance>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <memory unit='KiB'>131072</memory>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <vcpu placement='static'>1</vcpu>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <resource>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <partition>/machine</partition>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </resource>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <sysinfo type='smbios'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <entry name='manufacturer'>RDO</entry>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <entry name='product'>OpenStack Compute</entry>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <entry name='serial'>cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <entry name='uuid'>cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <entry name='family'>Virtual Machine</entry>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <boot dev='hd'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <smbios mode='sysinfo'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <vmcoreinfo state='on'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <cpu mode='custom' match='exact' check='full'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <model fallback='forbid'>EPYC-Rome</model>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <vendor>AMD</vendor>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='x2apic'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='tsc-deadline'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='hypervisor'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='tsc_adjust'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='spec-ctrl'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='stibp'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='ssbd'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='cmp_legacy'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='overflow-recov'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='succor'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='ibrs'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='amd-ssbd'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='virt-ssbd'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='lbrv'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='tsc-scale'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='vmcb-clean'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='flushbyasid'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='pause-filter'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='pfthreshold'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='svme-addr-chk'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='lfence-always-serializing'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='xsaves'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='svm'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='require' name='topoext'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='npt'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <feature policy='disable' name='nrip-save'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <clock offset='utc'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <timer name='pit' tickpolicy='delay'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <timer name='hpet' present='no'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <on_poweroff>destroy</on_poweroff>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <on_reboot>restart</on_reboot>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <on_crash>destroy</on_crash>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <disk type='file' device='disk'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <source file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk' index='2'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <backingStore type='file' index='3'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:        <format type='raw'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:        <source file='/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:        <backingStore/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      </backingStore>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target dev='vda' bus='virtio'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='virtio-disk0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <disk type='file' device='cdrom'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <driver name='qemu' type='raw' cache='none'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <source file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.config' index='1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <backingStore/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target dev='sda' bus='sata'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <readonly/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='sata0-0-0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='0' model='pcie-root'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pcie.0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='1' port='0x10'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='2' port='0x11'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.2'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='3' port='0x12'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.3'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='4' port='0x13'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.4'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='5' port='0x14'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.5'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='6' port='0x15'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.6'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='7' port='0x16'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.7'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='8' port='0x17'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.8'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='9' port='0x18'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.9'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='10' port='0x19'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.10'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='11' port='0x1a'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.11'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='12' port='0x1b'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.12'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='13' port='0x1c'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.13'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='14' port='0x1d'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.14'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='15' port='0x1e'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.15'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='16' port='0x1f'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.16'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='17' port='0x20'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.17'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='18' port='0x21'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.18'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='19' port='0x22'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.19'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='20' port='0x23'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.20'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='21' port='0x24'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.21'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='22' port='0x25'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.22'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='23' port='0x26'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.23'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='24' port='0x27'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.24'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target chassis='25' port='0x28'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.25'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model name='pcie-pci-bridge'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='pci.26'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='usb'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <controller type='sata' index='0'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='ide'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <interface type='ethernet'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <mac address='fa:16:3e:0e:68:b1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target dev='tap619a3ed2-fa'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model type='virtio'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <mtu size='1442'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='net0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <interface type='ethernet'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <mac address='fa:16:3e:9f:7c:e0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target dev='tap04107db0-1e'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model type='virtio'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <mtu size='1442'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='net1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <serial type='pty'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <source path='/dev/pts/0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <log file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log' append='off'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target type='isa-serial' port='0'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:        <model name='isa-serial'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      </target>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='serial0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <console type='pty' tty='/dev/pts/0'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <source path='/dev/pts/0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <log file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log' append='off'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <target type='serial' port='0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='serial0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </console>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <input type='tablet' bus='usb'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='input0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='usb' bus='0' port='1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <input type='mouse' bus='ps2'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='input1'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <input type='keyboard' bus='ps2'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='input2'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <listen type='address' address='::0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </graphics>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <audio id='1' type='none'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <model type='virtio' heads='1' primary='yes'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='video0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <watchdog model='itco' action='reset'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='watchdog0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </watchdog>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <memballoon model='virtio'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <stats period='10'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='balloon0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <rng model='virtio'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <backend model='random'>/dev/urandom</backend>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <alias name='rng0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <label>system_u:system_r:svirt_t:s0:c485,c929</label>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c485,c929</imagelabel>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </seclabel>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <label>+107:+107</label>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:    <imagelabel>+107:+107</imagelabel>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  </seclabel>
Nov 28 20:00:03 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:00:03 np0005539279 nova_compute[187514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.875 187518 INFO nova.virt.libvirt.driver [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully detached device tap04107db0-1e from instance cdaab479-3862-458b-b200-b443c1647c78 from the persistent domain config.#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.876 187518 DEBUG nova.virt.libvirt.driver [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] (1/8): Attempting to detach device tap04107db0-1e with device alias net1 from instance cdaab479-3862-458b-b200-b443c1647c78 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.877 187518 DEBUG nova.virt.libvirt.guest [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] detach device xml: <interface type="ethernet">
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <mac address="fa:16:3e:9f:7c:e0"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <model type="virtio"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <driver name="vhost" rx_queue_size="512"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <mtu size="1442"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]:  <target dev="tap04107db0-1e"/>
Nov 28 20:00:03 np0005539279 nova_compute[187514]: </interface>
Nov 28 20:00:03 np0005539279 nova_compute[187514]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 28 20:00:03 np0005539279 kernel: tap04107db0-1e (unregistering): left promiscuous mode
Nov 28 20:00:03 np0005539279 NetworkManager[55703]: <info>  [1764378003.9863] device (tap04107db0-1e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.994 187518 DEBUG nova.virt.libvirt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Received event <DeviceRemovedEvent: 1764378003.9942899, cdaab479-3862-458b-b200-b443c1647c78 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.996 187518 DEBUG nova.virt.libvirt.driver [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Start waiting for the detach event from libvirt for device tap04107db0-1e with device alias net1 for instance cdaab479-3862-458b-b200-b443c1647c78 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Nov 28 20:00:03 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:03Z|00111|binding|INFO|Releasing lport 04107db0-1e00-49d9-8888-dd071f790f24 from this chassis (sb_readonly=0)
Nov 28 20:00:03 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:03Z|00112|binding|INFO|Setting lport 04107db0-1e00-49d9-8888-dd071f790f24 down in Southbound
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.997 187518 DEBUG nova.virt.libvirt.guest [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 20:00:03 np0005539279 nova_compute[187514]: 2025-11-29 01:00:03.997 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:03 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:03Z|00113|binding|INFO|Removing iface tap04107db0-1e ovn-installed in OVS
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.001 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.002 187518 DEBUG nova.virt.libvirt.guest [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <name>instance-00000006</name>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <uuid>cdaab479-3862-458b-b200-b443c1647c78</uuid>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:name>tempest-TestNetworkBasicOps-server-2097217943</nova:name>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:creationTime>2025-11-29 00:58:32</nova:creationTime>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:flavor name="m1.nano">
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:memory>128</nova:memory>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:disk>1</nova:disk>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:swap>0</nova:swap>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:vcpus>1</nova:vcpus>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </nova:flavor>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:owner>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </nova:owner>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:ports>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:port uuid="619a3ed2-fa55-4d60-8e37-9fd4ff488e12">
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:port uuid="04107db0-1e00-49d9-8888-dd071f790f24">
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </nova:ports>
Nov 28 20:00:04 np0005539279 nova_compute[187514]: </nova:instance>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <memory unit='KiB'>131072</memory>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <vcpu placement='static'>1</vcpu>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <resource>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <partition>/machine</partition>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </resource>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <sysinfo type='smbios'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <entry name='manufacturer'>RDO</entry>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <entry name='product'>OpenStack Compute</entry>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <entry name='serial'>cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <entry name='uuid'>cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <entry name='family'>Virtual Machine</entry>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <boot dev='hd'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <smbios mode='sysinfo'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <vmcoreinfo state='on'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <cpu mode='custom' match='exact' check='full'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <model fallback='forbid'>EPYC-Rome</model>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <vendor>AMD</vendor>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='x2apic'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='tsc-deadline'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='hypervisor'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='tsc_adjust'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='spec-ctrl'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='stibp'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='ssbd'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='cmp_legacy'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='overflow-recov'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='succor'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='ibrs'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='amd-ssbd'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='virt-ssbd'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='lbrv'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='tsc-scale'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='vmcb-clean'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='flushbyasid'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='pause-filter'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='pfthreshold'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='svme-addr-chk'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='lfence-always-serializing'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='xsaves'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='svm'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='require' name='topoext'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='npt'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <feature policy='disable' name='nrip-save'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <clock offset='utc'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <timer name='pit' tickpolicy='delay'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <timer name='hpet' present='no'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <on_poweroff>destroy</on_poweroff>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <on_reboot>restart</on_reboot>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <on_crash>destroy</on_crash>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <disk type='file' device='disk'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <source file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk' index='2'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <backingStore type='file' index='3'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:        <format type='raw'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:        <source file='/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:        <backingStore/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      </backingStore>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target dev='vda' bus='virtio'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='virtio-disk0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <disk type='file' device='cdrom'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <driver name='qemu' type='raw' cache='none'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <source file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.config' index='1'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <backingStore/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target dev='sda' bus='sata'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <readonly/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='sata0-0-0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='0' model='pcie-root'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pcie.0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='1' port='0x10'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.1'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='2' port='0x11'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.2'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='3' port='0x12'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.3'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='4' port='0x13'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.4'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='5' port='0x14'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.5'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='6' port='0x15'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.6'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='7' port='0x16'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.7'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='8' port='0x17'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.8'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='9' port='0x18'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.9'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='10' port='0x19'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.10'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='11' port='0x1a'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.11'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='12' port='0x1b'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.12'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='13' port='0x1c'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.13'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='14' port='0x1d'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.14'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='15' port='0x1e'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.15'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='16' port='0x1f'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.16'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='17' port='0x20'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.17'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='18' port='0x21'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.18'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='19' port='0x22'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.19'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='20' port='0x23'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.20'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='21' port='0x24'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.21'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='22' port='0x25'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.22'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='23' port='0x26'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.23'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='24' port='0x27'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.24'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target chassis='25' port='0x28'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.25'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model name='pcie-pci-bridge'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='pci.26'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='usb'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <controller type='sata' index='0'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='ide'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <interface type='ethernet'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <mac address='fa:16:3e:0e:68:b1'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target dev='tap619a3ed2-fa'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model type='virtio'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <mtu size='1442'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='net0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <serial type='pty'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <source path='/dev/pts/0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <log file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log' append='off'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target type='isa-serial' port='0'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:        <model name='isa-serial'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      </target>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='serial0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <console type='pty' tty='/dev/pts/0'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <source path='/dev/pts/0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <log file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log' append='off'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <target type='serial' port='0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='serial0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </console>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <input type='tablet' bus='usb'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='input0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='usb' bus='0' port='1'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <input type='mouse' bus='ps2'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='input1'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <input type='keyboard' bus='ps2'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='input2'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <listen type='address' address='::0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </graphics>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <audio id='1' type='none'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <model type='virtio' heads='1' primary='yes'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='video0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <watchdog model='itco' action='reset'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='watchdog0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </watchdog>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <memballoon model='virtio'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <stats period='10'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='balloon0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <rng model='virtio'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <backend model='random'>/dev/urandom</backend>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <alias name='rng0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <label>system_u:system_r:svirt_t:s0:c485,c929</label>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c485,c929</imagelabel>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </seclabel>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <label>+107:+107</label>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <imagelabel>+107:+107</imagelabel>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </seclabel>
Nov 28 20:00:04 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:00:04 np0005539279 nova_compute[187514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.003 187518 INFO nova.virt.libvirt.driver [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully detached device tap04107db0-1e from instance cdaab479-3862-458b-b200-b443c1647c78 from the live domain config.
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.004 187518 DEBUG nova.virt.libvirt.vif [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2097217943',display_name='tempest-TestNetworkBasicOps-server-2097217943',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2097217943',id=6,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF910c7AbYs9dNjjBv3LjPc0J2B8QXYVJQTmIInU8lruARIboYDohKwEgeUYIOY0BzJik1EkH3h93U5lAz+8MC4WBOvGiE7MsVZAGGrudukAJjuE3vx7N1YN0Do/RHWTHw==',key_name='tempest-TestNetworkBasicOps-2147300220',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:58:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-glhr20uf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:58:04Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=cdaab479-3862-458b-b200-b443c1647c78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.005 187518 DEBUG nova.network.os_vif_util [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.006 187518 DEBUG nova.network.os_vif_util [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.006 187518 DEBUG os_vif [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.006 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:7c:e0 10.100.0.19', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-beb28e65-81a9-4c61-962b-bcd4d536483d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d853b3f-840e-4869-bcf7-a67e0ea8364c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=04107db0-1e00-49d9-8888-dd071f790f24) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.008 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 04107db0-1e00-49d9-8888-dd071f790f24 in datapath beb28e65-81a9-4c61-962b-bcd4d536483d unbound from our chassis
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.009 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network beb28e65-81a9-4c61-962b-bcd4d536483d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.009 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.010 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04107db0-1e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.010 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[add0df28-17e7-45dc-8fab-193abcc8a90f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.012 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d namespace which is not needed anymore
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.012 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.015 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.019 187518 INFO os_vif [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e')
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.020 187518 DEBUG nova.virt.libvirt.guest [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:name>tempest-TestNetworkBasicOps-server-2097217943</nova:name>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:creationTime>2025-11-29 01:00:04</nova:creationTime>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:flavor name="m1.nano">
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:memory>128</nova:memory>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:disk>1</nova:disk>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:swap>0</nova:swap>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:vcpus>1</nova:vcpus>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </nova:flavor>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:owner>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </nova:owner>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  <nova:ports>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    <nova:port uuid="619a3ed2-fa55-4d60-8e37-9fd4ff488e12">
Nov 28 20:00:04 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 20:00:04 np0005539279 nova_compute[187514]:  </nova:ports>
Nov 28 20:00:04 np0005539279 nova_compute[187514]: </nova:instance>
Nov 28 20:00:04 np0005539279 nova_compute[187514]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 28 20:00:04 np0005539279 neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d[216958]: [NOTICE]   (216962) : haproxy version is 2.8.14-c23fe91
Nov 28 20:00:04 np0005539279 neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d[216958]: [NOTICE]   (216962) : path to executable is /usr/sbin/haproxy
Nov 28 20:00:04 np0005539279 neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d[216958]: [WARNING]  (216962) : Exiting Master process...
Nov 28 20:00:04 np0005539279 neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d[216958]: [ALERT]    (216962) : Current worker (216964) exited with code 143 (Terminated)
Nov 28 20:00:04 np0005539279 neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d[216958]: [WARNING]  (216962) : All workers exited. Exiting... (0)
Nov 28 20:00:04 np0005539279 systemd[1]: libpod-e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8.scope: Deactivated successfully.
Nov 28 20:00:04 np0005539279 podman[217583]: 2025-11-29 01:00:04.196027125 +0000 UTC m=+0.059948111 container died e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:00:04 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8-userdata-shm.mount: Deactivated successfully.
Nov 28 20:00:04 np0005539279 systemd[1]: var-lib-containers-storage-overlay-cfbea9a12fc84277159a868b18d1d707bd27a2b5bd58affab360cdaf72e4f92e-merged.mount: Deactivated successfully.
Nov 28 20:00:04 np0005539279 podman[217583]: 2025-11-29 01:00:04.248603754 +0000 UTC m=+0.112524710 container cleanup e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 20:00:04 np0005539279 systemd[1]: libpod-conmon-e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8.scope: Deactivated successfully.
Nov 28 20:00:04 np0005539279 podman[217614]: 2025-11-29 01:00:04.364578576 +0000 UTC m=+0.075438731 container remove e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.374 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[a0fd18ae-39a4-43a1-8984-293a827cd646]: (4, ('Sat Nov 29 01:00:04 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d (e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8)\ne907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8\nSat Nov 29 01:00:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d (e907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8)\ne907b3250f701da45c2225f57f73c5f9c5aea3d4c8fa35b1e0eb99b1479535a8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.376 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[daa5faea-4c3c-4bc7-b6e7-8db903a36e61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.377 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbeb28e65-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.379 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:04 np0005539279 kernel: tapbeb28e65-80: left promiscuous mode
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.397 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.401 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2e96ce68-400c-4a1e-b589-d8fc90691a15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.417 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[db607b64-8564-4117-a47a-71c7a3840349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.419 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[5a64649c-b3c6-43c6-b4b8-275d89307574]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.439 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[ed1d29a8-a8b0-40e4-8e3e-dbc03ca55fdb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384656, 'reachable_time': 40772, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217629, 'error': None, 'target': 'ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:04 np0005539279 systemd[1]: run-netns-ovnmeta\x2dbeb28e65\x2d81a9\x2d4c61\x2d962b\x2dbcd4d536483d.mount: Deactivated successfully.
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.444 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-beb28e65-81a9-4c61-962b-bcd4d536483d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 20:00:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:04.445 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[8df99c3f-b782-4f9b-864f-2cacd2b238f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.821 187518 DEBUG oslo_concurrency.lockutils [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.822 187518 DEBUG oslo_concurrency.lockutils [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.823 187518 DEBUG nova.network.neutron [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.931 187518 DEBUG nova.compute.manager [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-deleted-04107db0-1e00-49d9-8888-dd071f790f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.932 187518 INFO nova.compute.manager [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Neutron deleted interface 04107db0-1e00-49d9-8888-dd071f790f24; detaching it from the instance and deleting it from the info cache#033[00m
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.933 187518 DEBUG nova.network.neutron [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:04 np0005539279 nova_compute[187514]: 2025-11-29 01:00:04.959 187518 DEBUG nova.objects.instance [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lazy-loading 'system_metadata' on Instance uuid cdaab479-3862-458b-b200-b443c1647c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.004 187518 DEBUG nova.objects.instance [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lazy-loading 'flavor' on Instance uuid cdaab479-3862-458b-b200-b443c1647c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.041 187518 DEBUG nova.virt.libvirt.vif [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2097217943',display_name='tempest-TestNetworkBasicOps-server-2097217943',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2097217943',id=6,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF910c7AbYs9dNjjBv3LjPc0J2B8QXYVJQTmIInU8lruARIboYDohKwEgeUYIOY0BzJik1EkH3h93U5lAz+8MC4WBOvGiE7MsVZAGGrudukAJjuE3vx7N1YN0Do/RHWTHw==',key_name='tempest-TestNetworkBasicOps-2147300220',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:58:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-glhr20uf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:58:04Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=cdaab479-3862-458b-b200-b443c1647c78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.042 187518 DEBUG nova.network.os_vif_util [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Converting VIF {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.043 187518 DEBUG nova.network.os_vif_util [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.046 187518 DEBUG nova.virt.libvirt.guest [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.050 187518 DEBUG nova.virt.libvirt.guest [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <name>instance-00000006</name>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <uuid>cdaab479-3862-458b-b200-b443c1647c78</uuid>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:name>tempest-TestNetworkBasicOps-server-2097217943</nova:name>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:creationTime>2025-11-29 01:00:04</nova:creationTime>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:flavor name="m1.nano">
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:memory>128</nova:memory>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:disk>1</nova:disk>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:swap>0</nova:swap>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:vcpus>1</nova:vcpus>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </nova:flavor>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:owner>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </nova:owner>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:ports>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:port uuid="619a3ed2-fa55-4d60-8e37-9fd4ff488e12">
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </nova:ports>
Nov 28 20:00:05 np0005539279 nova_compute[187514]: </nova:instance>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <memory unit='KiB'>131072</memory>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <vcpu placement='static'>1</vcpu>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <resource>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <partition>/machine</partition>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </resource>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <sysinfo type='smbios'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='manufacturer'>RDO</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='product'>OpenStack Compute</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='serial'>cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='uuid'>cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='family'>Virtual Machine</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <boot dev='hd'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <smbios mode='sysinfo'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <vmcoreinfo state='on'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <cpu mode='custom' match='exact' check='full'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <model fallback='forbid'>EPYC-Rome</model>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <vendor>AMD</vendor>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='x2apic'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='tsc-deadline'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='hypervisor'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='tsc_adjust'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='spec-ctrl'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='stibp'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='ssbd'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='cmp_legacy'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='overflow-recov'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='succor'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='ibrs'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='amd-ssbd'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='virt-ssbd'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='lbrv'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='tsc-scale'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='vmcb-clean'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='flushbyasid'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='pause-filter'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='pfthreshold'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='svme-addr-chk'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='lfence-always-serializing'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='xsaves'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='svm'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='topoext'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='npt'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='nrip-save'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <clock offset='utc'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <timer name='pit' tickpolicy='delay'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <timer name='hpet' present='no'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <on_poweroff>destroy</on_poweroff>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <on_reboot>restart</on_reboot>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <on_crash>destroy</on_crash>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <disk type='file' device='disk'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <source file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk' index='2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <backingStore type='file' index='3'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:        <format type='raw'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:        <source file='/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:        <backingStore/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      </backingStore>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target dev='vda' bus='virtio'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='virtio-disk0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <disk type='file' device='cdrom'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <driver name='qemu' type='raw' cache='none'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <source file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.config' index='1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <backingStore/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target dev='sda' bus='sata'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <readonly/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='sata0-0-0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='0' model='pcie-root'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pcie.0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='1' port='0x10'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='2' port='0x11'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='3' port='0x12'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.3'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='4' port='0x13'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.4'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='5' port='0x14'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.5'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='6' port='0x15'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.6'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='7' port='0x16'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.7'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='8' port='0x17'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.8'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='9' port='0x18'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.9'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='10' port='0x19'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.10'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='11' port='0x1a'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.11'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='12' port='0x1b'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.12'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='13' port='0x1c'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.13'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='14' port='0x1d'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.14'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='15' port='0x1e'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.15'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='16' port='0x1f'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.16'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='17' port='0x20'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.17'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='18' port='0x21'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.18'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='19' port='0x22'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.19'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='20' port='0x23'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.20'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='21' port='0x24'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.21'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='22' port='0x25'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.22'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='23' port='0x26'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.23'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='24' port='0x27'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.24'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='25' port='0x28'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.25'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-pci-bridge'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.26'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='usb'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='sata' index='0'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='ide'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <interface type='ethernet'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <mac address='fa:16:3e:0e:68:b1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target dev='tap619a3ed2-fa'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model type='virtio'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <mtu size='1442'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='net0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <serial type='pty'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <source path='/dev/pts/0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <log file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log' append='off'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target type='isa-serial' port='0'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:        <model name='isa-serial'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      </target>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='serial0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <console type='pty' tty='/dev/pts/0'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <source path='/dev/pts/0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <log file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log' append='off'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target type='serial' port='0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='serial0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </console>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <input type='tablet' bus='usb'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='input0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='usb' bus='0' port='1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <input type='mouse' bus='ps2'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='input1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <input type='keyboard' bus='ps2'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='input2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <listen type='address' address='::0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </graphics>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <audio id='1' type='none'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model type='virtio' heads='1' primary='yes'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='video0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <watchdog model='itco' action='reset'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='watchdog0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </watchdog>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <memballoon model='virtio'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <stats period='10'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='balloon0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <rng model='virtio'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <backend model='random'>/dev/urandom</backend>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='rng0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <label>system_u:system_r:svirt_t:s0:c485,c929</label>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c485,c929</imagelabel>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </seclabel>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <label>+107:+107</label>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <imagelabel>+107:+107</imagelabel>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </seclabel>
Nov 28 20:00:05 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:00:05 np0005539279 nova_compute[187514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.052 187518 DEBUG nova.virt.libvirt.guest [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.058 187518 DEBUG nova.virt.libvirt.guest [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:9f:7c:e0"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap04107db0-1e"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <name>instance-00000006</name>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <uuid>cdaab479-3862-458b-b200-b443c1647c78</uuid>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:name>tempest-TestNetworkBasicOps-server-2097217943</nova:name>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:creationTime>2025-11-29 01:00:04</nova:creationTime>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:flavor name="m1.nano">
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:memory>128</nova:memory>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:disk>1</nova:disk>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:swap>0</nova:swap>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:vcpus>1</nova:vcpus>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </nova:flavor>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:owner>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </nova:owner>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:ports>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:port uuid="619a3ed2-fa55-4d60-8e37-9fd4ff488e12">
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </nova:ports>
Nov 28 20:00:05 np0005539279 nova_compute[187514]: </nova:instance>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <memory unit='KiB'>131072</memory>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <vcpu placement='static'>1</vcpu>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <resource>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <partition>/machine</partition>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </resource>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <sysinfo type='smbios'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='manufacturer'>RDO</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='product'>OpenStack Compute</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='serial'>cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='uuid'>cdaab479-3862-458b-b200-b443c1647c78</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <entry name='family'>Virtual Machine</entry>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <boot dev='hd'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <smbios mode='sysinfo'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <vmcoreinfo state='on'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <cpu mode='custom' match='exact' check='full'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <model fallback='forbid'>EPYC-Rome</model>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <vendor>AMD</vendor>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='x2apic'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='tsc-deadline'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='hypervisor'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='tsc_adjust'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='spec-ctrl'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='stibp'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='ssbd'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='cmp_legacy'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='overflow-recov'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='succor'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='ibrs'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='amd-ssbd'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='virt-ssbd'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='lbrv'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='tsc-scale'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='vmcb-clean'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='flushbyasid'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='pause-filter'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='pfthreshold'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='svme-addr-chk'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='lfence-always-serializing'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='xsaves'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='svm'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='require' name='topoext'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='npt'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <feature policy='disable' name='nrip-save'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <clock offset='utc'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <timer name='pit' tickpolicy='delay'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <timer name='hpet' present='no'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <on_poweroff>destroy</on_poweroff>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <on_reboot>restart</on_reboot>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <on_crash>destroy</on_crash>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <disk type='file' device='disk'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <source file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk' index='2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <backingStore type='file' index='3'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:        <format type='raw'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:        <source file='/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:        <backingStore/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      </backingStore>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target dev='vda' bus='virtio'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='virtio-disk0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <disk type='file' device='cdrom'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <driver name='qemu' type='raw' cache='none'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <source file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/disk.config' index='1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <backingStore/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target dev='sda' bus='sata'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <readonly/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='sata0-0-0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='0' model='pcie-root'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pcie.0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='1' port='0x10'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='2' port='0x11'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='3' port='0x12'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.3'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='4' port='0x13'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.4'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='5' port='0x14'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.5'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='6' port='0x15'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.6'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='7' port='0x16'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.7'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='8' port='0x17'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.8'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='9' port='0x18'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.9'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='10' port='0x19'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.10'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='11' port='0x1a'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.11'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='12' port='0x1b'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.12'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='13' port='0x1c'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.13'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='14' port='0x1d'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.14'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='15' port='0x1e'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.15'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='16' port='0x1f'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.16'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='17' port='0x20'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.17'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='18' port='0x21'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.18'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='19' port='0x22'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.19'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='20' port='0x23'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.20'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='21' port='0x24'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.21'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='22' port='0x25'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.22'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='23' port='0x26'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.23'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='24' port='0x27'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.24'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-root-port'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target chassis='25' port='0x28'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.25'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model name='pcie-pci-bridge'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='pci.26'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='usb'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <controller type='sata' index='0'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='ide'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </controller>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <interface type='ethernet'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <mac address='fa:16:3e:0e:68:b1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target dev='tap619a3ed2-fa'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model type='virtio'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <mtu size='1442'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='net0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <serial type='pty'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <source path='/dev/pts/0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <log file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log' append='off'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target type='isa-serial' port='0'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:        <model name='isa-serial'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      </target>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='serial0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <console type='pty' tty='/dev/pts/0'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <source path='/dev/pts/0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <log file='/var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78/console.log' append='off'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <target type='serial' port='0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='serial0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </console>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <input type='tablet' bus='usb'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='input0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='usb' bus='0' port='1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <input type='mouse' bus='ps2'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='input1'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <input type='keyboard' bus='ps2'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='input2'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </input>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <listen type='address' address='::0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </graphics>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <audio id='1' type='none'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <model type='virtio' heads='1' primary='yes'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='video0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <watchdog model='itco' action='reset'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='watchdog0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </watchdog>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <memballoon model='virtio'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <stats period='10'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='balloon0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <rng model='virtio'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <backend model='random'>/dev/urandom</backend>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <alias name='rng0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <label>system_u:system_r:svirt_t:s0:c485,c929</label>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c485,c929</imagelabel>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </seclabel>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <label>+107:+107</label>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <imagelabel>+107:+107</imagelabel>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </seclabel>
Nov 28 20:00:05 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:00:05 np0005539279 nova_compute[187514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.058 187518 WARNING nova.virt.libvirt.driver [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Detaching interface fa:16:3e:9f:7c:e0 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap04107db0-1e' not found.
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.060 187518 DEBUG nova.virt.libvirt.vif [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2097217943',display_name='tempest-TestNetworkBasicOps-server-2097217943',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2097217943',id=6,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF910c7AbYs9dNjjBv3LjPc0J2B8QXYVJQTmIInU8lruARIboYDohKwEgeUYIOY0BzJik1EkH3h93U5lAz+8MC4WBOvGiE7MsVZAGGrudukAJjuE3vx7N1YN0Do/RHWTHw==',key_name='tempest-TestNetworkBasicOps-2147300220',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:58:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-glhr20uf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:58:04Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=cdaab479-3862-458b-b200-b443c1647c78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.060 187518 DEBUG nova.network.os_vif_util [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Converting VIF {"id": "04107db0-1e00-49d9-8888-dd071f790f24", "address": "fa:16:3e:9f:7c:e0", "network": {"id": "beb28e65-81a9-4c61-962b-bcd4d536483d", "bridge": "br-int", "label": "tempest-network-smoke--1563238225", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04107db0-1e", "ovs_interfaceid": "04107db0-1e00-49d9-8888-dd071f790f24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.061 187518 DEBUG nova.network.os_vif_util [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.062 187518 DEBUG os_vif [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.064 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.064 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04107db0-1e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.065 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.068 187518 INFO os_vif [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:7c:e0,bridge_name='br-int',has_traffic_filtering=True,id=04107db0-1e00-49d9-8888-dd071f790f24,network=Network(beb28e65-81a9-4c61-962b-bcd4d536483d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04107db0-1e')#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.068 187518 DEBUG nova.virt.libvirt.guest [req-4e07139e-e9c5-44a3-a7cc-bf980196bc48 req-ccb05333-031d-4c13-84d1-61a74c808ddb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:name>tempest-TestNetworkBasicOps-server-2097217943</nova:name>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:creationTime>2025-11-29 01:00:05</nova:creationTime>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:flavor name="m1.nano">
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:memory>128</nova:memory>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:disk>1</nova:disk>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:swap>0</nova:swap>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:vcpus>1</nova:vcpus>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </nova:flavor>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:owner>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </nova:owner>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  <nova:ports>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    <nova:port uuid="619a3ed2-fa55-4d60-8e37-9fd4ff488e12">
Nov 28 20:00:05 np0005539279 nova_compute[187514]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:    </nova:port>
Nov 28 20:00:05 np0005539279 nova_compute[187514]:  </nova:ports>
Nov 28 20:00:05 np0005539279 nova_compute[187514]: </nova:instance>
Nov 28 20:00:05 np0005539279 nova_compute[187514]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.858 187518 DEBUG nova.compute.manager [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-unplugged-04107db0-1e00-49d9-8888-dd071f790f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.858 187518 DEBUG oslo_concurrency.lockutils [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.858 187518 DEBUG oslo_concurrency.lockutils [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.859 187518 DEBUG oslo_concurrency.lockutils [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.859 187518 DEBUG nova.compute.manager [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] No waiting events found dispatching network-vif-unplugged-04107db0-1e00-49d9-8888-dd071f790f24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.859 187518 WARNING nova.compute.manager [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received unexpected event network-vif-unplugged-04107db0-1e00-49d9-8888-dd071f790f24 for instance with vm_state active and task_state None.#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.859 187518 DEBUG nova.compute.manager [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-plugged-04107db0-1e00-49d9-8888-dd071f790f24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.860 187518 DEBUG oslo_concurrency.lockutils [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.860 187518 DEBUG oslo_concurrency.lockutils [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.860 187518 DEBUG oslo_concurrency.lockutils [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.860 187518 DEBUG nova.compute.manager [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] No waiting events found dispatching network-vif-plugged-04107db0-1e00-49d9-8888-dd071f790f24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:05 np0005539279 nova_compute[187514]: 2025-11-29 01:00:05.860 187518 WARNING nova.compute.manager [req-336a3cb4-a061-4ac0-9afd-b1c31cab491c req-345318a4-dd4d-4f25-ad40-bfc84d28e644 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received unexpected event network-vif-plugged-04107db0-1e00-49d9-8888-dd071f790f24 for instance with vm_state active and task_state None.#033[00m
Nov 28 20:00:06 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:06Z|00114|binding|INFO|Releasing lport 1e718df3-210b-4d88-80e9-df977e4844c7 from this chassis (sb_readonly=0)
Nov 28 20:00:06 np0005539279 nova_compute[187514]: 2025-11-29 01:00:06.763 187518 INFO nova.network.neutron [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Port 04107db0-1e00-49d9-8888-dd071f790f24 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 28 20:00:06 np0005539279 nova_compute[187514]: 2025-11-29 01:00:06.764 187518 DEBUG nova.network.neutron [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:06 np0005539279 nova_compute[187514]: 2025-11-29 01:00:06.829 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:06 np0005539279 nova_compute[187514]: 2025-11-29 01:00:06.855 187518 DEBUG oslo_concurrency.lockutils [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:00:06 np0005539279 nova_compute[187514]: 2025-11-29 01:00:06.906 187518 DEBUG oslo_concurrency.lockutils [None req-fe4d54ec-d71d-4d53-91f2-ebd8f05adb29 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "interface-cdaab479-3862-458b-b200-b443c1647c78-04107db0-1e00-49d9-8888-dd071f790f24" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:07 np0005539279 nova_compute[187514]: 2025-11-29 01:00:07.549 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:08.094 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:08.095 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:08.096 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.574 187518 DEBUG nova.compute.manager [req-f80637e6-7479-4a43-9540-548f5fc5022d req-b50be03a-d70f-40d5-b493-5bec56f743e4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-changed-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.575 187518 DEBUG nova.compute.manager [req-f80637e6-7479-4a43-9540-548f5fc5022d req-b50be03a-d70f-40d5-b493-5bec56f743e4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing instance network info cache due to event network-changed-619a3ed2-fa55-4d60-8e37-9fd4ff488e12. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.575 187518 DEBUG oslo_concurrency.lockutils [req-f80637e6-7479-4a43-9540-548f5fc5022d req-b50be03a-d70f-40d5-b493-5bec56f743e4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.576 187518 DEBUG oslo_concurrency.lockutils [req-f80637e6-7479-4a43-9540-548f5fc5022d req-b50be03a-d70f-40d5-b493-5bec56f743e4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.576 187518 DEBUG nova.network.neutron [req-f80637e6-7479-4a43-9540-548f5fc5022d req-b50be03a-d70f-40d5-b493-5bec56f743e4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Refreshing network info cache for port 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.706 187518 DEBUG oslo_concurrency.lockutils [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.706 187518 DEBUG oslo_concurrency.lockutils [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.707 187518 DEBUG oslo_concurrency.lockutils [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.707 187518 DEBUG oslo_concurrency.lockutils [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.708 187518 DEBUG oslo_concurrency.lockutils [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.710 187518 INFO nova.compute.manager [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Terminating instance#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.711 187518 DEBUG nova.compute.manager [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 20:00:08 np0005539279 kernel: tap619a3ed2-fa (unregistering): left promiscuous mode
Nov 28 20:00:08 np0005539279 NetworkManager[55703]: <info>  [1764378008.7448] device (tap619a3ed2-fa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.754 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:08 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:08Z|00115|binding|INFO|Releasing lport 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 from this chassis (sb_readonly=0)
Nov 28 20:00:08 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:08Z|00116|binding|INFO|Setting lport 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 down in Southbound
Nov 28 20:00:08 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:08Z|00117|binding|INFO|Removing iface tap619a3ed2-fa ovn-installed in OVS
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.759 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:08.765 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:68:b1 10.100.0.4'], port_security=['fa:16:3e:0e:68:b1 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'cdaab479-3862-458b-b200-b443c1647c78', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-772dc02e-f97e-4f35-bbad-0f0f22357164', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c188716c-76b9-447e-b8e8-521d447349a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef1e45ce-2254-4684-b7ba-97523ff379ec, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=619a3ed2-fa55-4d60-8e37-9fd4ff488e12) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:00:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:08.766 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 619a3ed2-fa55-4d60-8e37-9fd4ff488e12 in datapath 772dc02e-f97e-4f35-bbad-0f0f22357164 unbound from our chassis#033[00m
Nov 28 20:00:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:08.767 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 772dc02e-f97e-4f35-bbad-0f0f22357164, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 20:00:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:08.768 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[509128d8-60bf-4beb-a606-3f62174170d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:08.769 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164 namespace which is not needed anymore#033[00m
Nov 28 20:00:08 np0005539279 nova_compute[187514]: 2025-11-29 01:00:08.776 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:08 np0005539279 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 28 20:00:08 np0005539279 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 18.937s CPU time.
Nov 28 20:00:08 np0005539279 systemd-machined[153752]: Machine qemu-6-instance-00000006 terminated.
Nov 28 20:00:08 np0005539279 neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164[216678]: [NOTICE]   (216682) : haproxy version is 2.8.14-c23fe91
Nov 28 20:00:08 np0005539279 neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164[216678]: [NOTICE]   (216682) : path to executable is /usr/sbin/haproxy
Nov 28 20:00:08 np0005539279 neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164[216678]: [WARNING]  (216682) : Exiting Master process...
Nov 28 20:00:08 np0005539279 neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164[216678]: [ALERT]    (216682) : Current worker (216684) exited with code 143 (Terminated)
Nov 28 20:00:08 np0005539279 neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164[216678]: [WARNING]  (216682) : All workers exited. Exiting... (0)
Nov 28 20:00:08 np0005539279 systemd[1]: libpod-b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe.scope: Deactivated successfully.
Nov 28 20:00:08 np0005539279 conmon[216678]: conmon b7b8a81d2381651a8277 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe.scope/container/memory.events
Nov 28 20:00:08 np0005539279 podman[217656]: 2025-11-29 01:00:08.964792884 +0000 UTC m=+0.066029896 container died b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 20:00:09 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe-userdata-shm.mount: Deactivated successfully.
Nov 28 20:00:09 np0005539279 systemd[1]: var-lib-containers-storage-overlay-c2cae2d4a01427ecee6ddaeb52092a8f9fc436a86400ce83c23f4f4bbcbe7a61-merged.mount: Deactivated successfully.
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.007 187518 INFO nova.virt.libvirt.driver [-] [instance: cdaab479-3862-458b-b200-b443c1647c78] Instance destroyed successfully.#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.009 187518 DEBUG nova.objects.instance [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid cdaab479-3862-458b-b200-b443c1647c78 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.012 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:09 np0005539279 podman[217656]: 2025-11-29 01:00:09.014500035 +0000 UTC m=+0.115737007 container cleanup b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:00:09 np0005539279 systemd[1]: libpod-conmon-b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe.scope: Deactivated successfully.
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.024 187518 DEBUG nova.virt.libvirt.vif [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T00:57:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2097217943',display_name='tempest-TestNetworkBasicOps-server-2097217943',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2097217943',id=6,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF910c7AbYs9dNjjBv3LjPc0J2B8QXYVJQTmIInU8lruARIboYDohKwEgeUYIOY0BzJik1EkH3h93U5lAz+8MC4WBOvGiE7MsVZAGGrudukAJjuE3vx7N1YN0Do/RHWTHw==',key_name='tempest-TestNetworkBasicOps-2147300220',keypairs=<?>,launch_index=0,launched_at=2025-11-29T00:58:04Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-glhr20uf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T00:58:04Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=cdaab479-3862-458b-b200-b443c1647c78,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.025 187518 DEBUG nova.network.os_vif_util [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.026 187518 DEBUG nova.network.os_vif_util [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0e:68:b1,bridge_name='br-int',has_traffic_filtering=True,id=619a3ed2-fa55-4d60-8e37-9fd4ff488e12,network=Network(772dc02e-f97e-4f35-bbad-0f0f22357164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap619a3ed2-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.026 187518 DEBUG os_vif [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:68:b1,bridge_name='br-int',has_traffic_filtering=True,id=619a3ed2-fa55-4d60-8e37-9fd4ff488e12,network=Network(772dc02e-f97e-4f35-bbad-0f0f22357164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap619a3ed2-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.029 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.029 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap619a3ed2-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.073 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.075 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.079 187518 INFO os_vif [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:68:b1,bridge_name='br-int',has_traffic_filtering=True,id=619a3ed2-fa55-4d60-8e37-9fd4ff488e12,network=Network(772dc02e-f97e-4f35-bbad-0f0f22357164),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap619a3ed2-fa')#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.080 187518 INFO nova.virt.libvirt.driver [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Deleting instance files /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78_del#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.081 187518 INFO nova.virt.libvirt.driver [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Deletion of /var/lib/nova/instances/cdaab479-3862-458b-b200-b443c1647c78_del complete#033[00m
Nov 28 20:00:09 np0005539279 podman[217700]: 2025-11-29 01:00:09.108618223 +0000 UTC m=+0.060995119 container remove b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:00:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:09.117 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[968ea3a3-d3cf-42f8-a9a7-4fdd6719d2d7]: (4, ('Sat Nov 29 01:00:08 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164 (b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe)\nb7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe\nSat Nov 29 01:00:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164 (b7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe)\nb7b8a81d2381651a82779198a7e1c60c8bc5611446fdf4afe6e9f6b7e6f0b2fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:09.120 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e49c389b-7715-49e4-ae3c-7b1758382360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:09.121 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap772dc02e-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.124 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.138 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:09 np0005539279 kernel: tap772dc02e-f0: left promiscuous mode
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.140 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:09.143 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ac7cee-ac74-4807-8425-d56112e2a04c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.150 187518 INFO nova.compute.manager [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Took 0.44 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.150 187518 DEBUG oslo.service.loopingcall [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.151 187518 DEBUG nova.compute.manager [-] [instance: cdaab479-3862-458b-b200-b443c1647c78] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.151 187518 DEBUG nova.network.neutron [-] [instance: cdaab479-3862-458b-b200-b443c1647c78] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 20:00:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:09.158 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe6cf9d-fdd5-46d1-a24b-18432dcb3539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:09.159 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[caaace26-ea7b-4009-a7c6-8db351c692e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:09.181 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[94efb246-4464-41d7-95ed-6b624c38947a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381780, 'reachable_time': 42915, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217713, 'error': None, 'target': 'ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:09.184 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-772dc02e-f97e-4f35-bbad-0f0f22357164 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 20:00:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:09.184 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[51373aae-7394-4501-a165-9e22cbfc647b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:09 np0005539279 systemd[1]: run-netns-ovnmeta\x2d772dc02e\x2df97e\x2d4f35\x2dbbad\x2d0f0f22357164.mount: Deactivated successfully.
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.869 187518 DEBUG nova.network.neutron [-] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.894 187518 INFO nova.compute.manager [-] [instance: cdaab479-3862-458b-b200-b443c1647c78] Took 0.74 seconds to deallocate network for instance.#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.914 187518 DEBUG nova.compute.manager [req-49d9c14d-18f6-46ff-a878-84bbf7e73ebe req-3ec16114-105e-4c5e-a51d-e53ef5e8125f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-deleted-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.995 187518 DEBUG oslo_concurrency.lockutils [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:09 np0005539279 nova_compute[187514]: 2025-11-29 01:00:09.996 187518 DEBUG oslo_concurrency.lockutils [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.069 187518 DEBUG nova.compute.provider_tree [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.090 187518 DEBUG nova.scheduler.client.report [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.217 187518 DEBUG oslo_concurrency.lockutils [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.239 187518 INFO nova.scheduler.client.report [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance cdaab479-3862-458b-b200-b443c1647c78#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.323 187518 DEBUG nova.network.neutron [req-f80637e6-7479-4a43-9540-548f5fc5022d req-b50be03a-d70f-40d5-b493-5bec56f743e4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updated VIF entry in instance network info cache for port 619a3ed2-fa55-4d60-8e37-9fd4ff488e12. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.324 187518 DEBUG nova.network.neutron [req-f80637e6-7479-4a43-9540-548f5fc5022d req-b50be03a-d70f-40d5-b493-5bec56f743e4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Updating instance_info_cache with network_info: [{"id": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "address": "fa:16:3e:0e:68:b1", "network": {"id": "772dc02e-f97e-4f35-bbad-0f0f22357164", "bridge": "br-int", "label": "tempest-network-smoke--1422475054", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap619a3ed2-fa", "ovs_interfaceid": "619a3ed2-fa55-4d60-8e37-9fd4ff488e12", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.344 187518 DEBUG oslo_concurrency.lockutils [None req-441a6c01-6a5a-41db-9772-a7ec15a94ace 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.348 187518 DEBUG oslo_concurrency.lockutils [req-f80637e6-7479-4a43-9540-548f5fc5022d req-b50be03a-d70f-40d5-b493-5bec56f743e4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-cdaab479-3862-458b-b200-b443c1647c78" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.676 187518 DEBUG nova.compute.manager [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-unplugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.676 187518 DEBUG oslo_concurrency.lockutils [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.677 187518 DEBUG oslo_concurrency.lockutils [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.677 187518 DEBUG oslo_concurrency.lockutils [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.677 187518 DEBUG nova.compute.manager [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] No waiting events found dispatching network-vif-unplugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.677 187518 WARNING nova.compute.manager [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received unexpected event network-vif-unplugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.677 187518 DEBUG nova.compute.manager [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received event network-vif-plugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.678 187518 DEBUG oslo_concurrency.lockutils [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "cdaab479-3862-458b-b200-b443c1647c78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.678 187518 DEBUG oslo_concurrency.lockutils [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.678 187518 DEBUG oslo_concurrency.lockutils [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "cdaab479-3862-458b-b200-b443c1647c78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.678 187518 DEBUG nova.compute.manager [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] No waiting events found dispatching network-vif-plugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:10 np0005539279 nova_compute[187514]: 2025-11-29 01:00:10.678 187518 WARNING nova.compute.manager [req-a9c0efb9-041b-481a-8a00-f7a25b04f12b req-a59c5b9a-6192-4a89-a0e7-9f0f80e5e071 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: cdaab479-3862-458b-b200-b443c1647c78] Received unexpected event network-vif-plugged-619a3ed2-fa55-4d60-8e37-9fd4ff488e12 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 20:00:12 np0005539279 nova_compute[187514]: 2025-11-29 01:00:12.551 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:13 np0005539279 nova_compute[187514]: 2025-11-29 01:00:13.133 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:13 np0005539279 nova_compute[187514]: 2025-11-29 01:00:13.236 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:14 np0005539279 nova_compute[187514]: 2025-11-29 01:00:14.073 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:15 np0005539279 nova_compute[187514]: 2025-11-29 01:00:15.835 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764378000.8346183, 375250f0-4625-4017-ac44-e74799c55dbf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:00:15 np0005539279 nova_compute[187514]: 2025-11-29 01:00:15.836 187518 INFO nova.compute.manager [-] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] VM Stopped (Lifecycle Event)#033[00m
Nov 28 20:00:15 np0005539279 nova_compute[187514]: 2025-11-29 01:00:15.884 187518 DEBUG nova.compute.manager [None req-36140f5f-1942-44cc-afb1-6aa590e17136 - - - - - -] [instance: 375250f0-4625-4017-ac44-e74799c55dbf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:16 np0005539279 podman[217720]: 2025-11-29 01:00:16.348013525 +0000 UTC m=+0.091985431 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 20:00:16 np0005539279 podman[217721]: 2025-11-29 01:00:16.358353016 +0000 UTC m=+0.102594139 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:00:16 np0005539279 podman[217719]: 2025-11-29 01:00:16.372978654 +0000 UTC m=+0.125369299 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 20:00:17 np0005539279 nova_compute[187514]: 2025-11-29 01:00:17.552 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:19 np0005539279 nova_compute[187514]: 2025-11-29 01:00:19.076 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:21 np0005539279 nova_compute[187514]: 2025-11-29 01:00:21.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.555 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.643 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.644 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.644 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.645 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.921 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.923 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5764MB free_disk=73.33942413330078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.924 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.924 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.992 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:00:22 np0005539279 nova_compute[187514]: 2025-11-29 01:00:22.993 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:00:23 np0005539279 nova_compute[187514]: 2025-11-29 01:00:23.023 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:00:23 np0005539279 nova_compute[187514]: 2025-11-29 01:00:23.040 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:00:23 np0005539279 nova_compute[187514]: 2025-11-29 01:00:23.066 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:00:23 np0005539279 nova_compute[187514]: 2025-11-29 01:00:23.067 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:24 np0005539279 nova_compute[187514]: 2025-11-29 01:00:24.003 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764378009.001821, cdaab479-3862-458b-b200-b443c1647c78 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:00:24 np0005539279 nova_compute[187514]: 2025-11-29 01:00:24.004 187518 INFO nova.compute.manager [-] [instance: cdaab479-3862-458b-b200-b443c1647c78] VM Stopped (Lifecycle Event)#033[00m
Nov 28 20:00:24 np0005539279 nova_compute[187514]: 2025-11-29 01:00:24.025 187518 DEBUG nova.compute.manager [None req-c82a0b25-fd98-49e4-99ef-59979d32370c - - - - - -] [instance: cdaab479-3862-458b-b200-b443c1647c78] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:24 np0005539279 nova_compute[187514]: 2025-11-29 01:00:24.080 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:26 np0005539279 nova_compute[187514]: 2025-11-29 01:00:26.069 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:26 np0005539279 nova_compute[187514]: 2025-11-29 01:00:26.069 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:00:26 np0005539279 nova_compute[187514]: 2025-11-29 01:00:26.097 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:00:26 np0005539279 nova_compute[187514]: 2025-11-29 01:00:26.098 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:26 np0005539279 nova_compute[187514]: 2025-11-29 01:00:26.099 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:26 np0005539279 nova_compute[187514]: 2025-11-29 01:00:26.099 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:00:26 np0005539279 nova_compute[187514]: 2025-11-29 01:00:26.635 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:27 np0005539279 nova_compute[187514]: 2025-11-29 01:00:27.558 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.084 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:29 np0005539279 podman[217784]: 2025-11-29 01:00:29.159994634 +0000 UTC m=+0.072893552 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 20:00:29 np0005539279 podman[217785]: 2025-11-29 01:00:29.163118389 +0000 UTC m=+0.070106946 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.216 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "8b1a6d47-bfdb-471f-8cde-41890ea21904" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.217 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.232 187518 DEBUG nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.298 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.299 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.310 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.311 187518 INFO nova.compute.claims [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.490 187518 DEBUG nova.compute.provider_tree [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.515 187518 DEBUG nova.scheduler.client.report [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.544 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.545 187518 DEBUG nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.594 187518 DEBUG nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.595 187518 DEBUG nova.network.neutron [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.618 187518 INFO nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.637 187518 DEBUG nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.751 187518 DEBUG nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.754 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.755 187518 INFO nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Creating image(s)#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.756 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.757 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.759 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.784 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.880 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.881 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.883 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.909 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.996 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:29 np0005539279 nova_compute[187514]: 2025-11-29 01:00:29.998 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.070 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk 1073741824" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.071 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.072 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.154 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.156 187518 DEBUG nova.virt.disk.api [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.157 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.236 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.238 187518 DEBUG nova.virt.disk.api [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.238 187518 DEBUG nova.objects.instance [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid 8b1a6d47-bfdb-471f-8cde-41890ea21904 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.261 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.261 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Ensure instance console log exists: /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.262 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.263 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.264 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.590 187518 DEBUG nova.policy [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.605 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:30 np0005539279 nova_compute[187514]: 2025-11-29 01:00:30.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:31 np0005539279 nova_compute[187514]: 2025-11-29 01:00:31.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:00:32 np0005539279 nova_compute[187514]: 2025-11-29 01:00:32.324 187518 DEBUG nova.network.neutron [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Successfully updated port: 895d0a7c-e1eb-4602-8b34-4390b3fca106 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 20:00:32 np0005539279 nova_compute[187514]: 2025-11-29 01:00:32.341 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-8b1a6d47-bfdb-471f-8cde-41890ea21904" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:00:32 np0005539279 nova_compute[187514]: 2025-11-29 01:00:32.342 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-8b1a6d47-bfdb-471f-8cde-41890ea21904" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:00:32 np0005539279 nova_compute[187514]: 2025-11-29 01:00:32.342 187518 DEBUG nova.network.neutron [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 20:00:32 np0005539279 nova_compute[187514]: 2025-11-29 01:00:32.475 187518 DEBUG nova.compute.manager [req-430a9e79-0feb-451e-ab4f-91ee3315d5a9 req-08a1a818-bf22-4686-8267-f2687924ce0f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Received event network-changed-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:32 np0005539279 nova_compute[187514]: 2025-11-29 01:00:32.475 187518 DEBUG nova.compute.manager [req-430a9e79-0feb-451e-ab4f-91ee3315d5a9 req-08a1a818-bf22-4686-8267-f2687924ce0f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Refreshing instance network info cache due to event network-changed-895d0a7c-e1eb-4602-8b34-4390b3fca106. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:00:32 np0005539279 nova_compute[187514]: 2025-11-29 01:00:32.475 187518 DEBUG oslo_concurrency.lockutils [req-430a9e79-0feb-451e-ab4f-91ee3315d5a9 req-08a1a818-bf22-4686-8267-f2687924ce0f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-8b1a6d47-bfdb-471f-8cde-41890ea21904" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:00:32 np0005539279 nova_compute[187514]: 2025-11-29 01:00:32.507 187518 DEBUG nova.network.neutron [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 20:00:32 np0005539279 nova_compute[187514]: 2025-11-29 01:00:32.560 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:32 np0005539279 podman[217846]: 2025-11-29 01:00:32.850373111 +0000 UTC m=+0.086842741 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 20:00:32 np0005539279 podman[217845]: 2025-11-29 01:00:32.925922305 +0000 UTC m=+0.166820715 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.611 187518 DEBUG nova.network.neutron [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Updating instance_info_cache with network_info: [{"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.641 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-8b1a6d47-bfdb-471f-8cde-41890ea21904" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.641 187518 DEBUG nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Instance network_info: |[{"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.642 187518 DEBUG oslo_concurrency.lockutils [req-430a9e79-0feb-451e-ab4f-91ee3315d5a9 req-08a1a818-bf22-4686-8267-f2687924ce0f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-8b1a6d47-bfdb-471f-8cde-41890ea21904" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.642 187518 DEBUG nova.network.neutron [req-430a9e79-0feb-451e-ab4f-91ee3315d5a9 req-08a1a818-bf22-4686-8267-f2687924ce0f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Refreshing network info cache for port 895d0a7c-e1eb-4602-8b34-4390b3fca106 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.648 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Start _get_guest_xml network_info=[{"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.657 187518 WARNING nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.662 187518 DEBUG nova.virt.libvirt.host [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.664 187518 DEBUG nova.virt.libvirt.host [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.674 187518 DEBUG nova.virt.libvirt.host [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.675 187518 DEBUG nova.virt.libvirt.host [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.675 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.676 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.677 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.677 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.678 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.678 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.679 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.679 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.680 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.680 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.681 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.681 187518 DEBUG nova.virt.hardware [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.688 187518 DEBUG nova.virt.libvirt.vif [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:00:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1875875969',display_name='tempest-TestNetworkBasicOps-server-1875875969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1875875969',id=8,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPWO8BIvPMdMW5mPpvy28duxybDjRtZs9spq/TXFqpqIPX+ORupsZKlEA+FcoqTq9lUg4Coh6oFUnatUrdOZTY3dueNHRFB+Sk2YjeUMQCjsIz5vAE8TA3pDtRzbrcbFUg==',key_name='tempest-TestNetworkBasicOps-340010307',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-7y7vsnpg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:00:29Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=8b1a6d47-bfdb-471f-8cde-41890ea21904,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.688 187518 DEBUG nova.network.os_vif_util [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.690 187518 DEBUG nova.network.os_vif_util [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.692 187518 DEBUG nova.objects.instance [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8b1a6d47-bfdb-471f-8cde-41890ea21904 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.710 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] End _get_guest_xml xml=<domain type="kvm">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <uuid>8b1a6d47-bfdb-471f-8cde-41890ea21904</uuid>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <name>instance-00000008</name>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-1875875969</nova:name>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 01:00:33</nova:creationTime>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:        <nova:port uuid="895d0a7c-e1eb-4602-8b34-4390b3fca106">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <entry name="serial">8b1a6d47-bfdb-471f-8cde-41890ea21904</entry>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <entry name="uuid">8b1a6d47-bfdb-471f-8cde-41890ea21904</entry>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk.config"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:10:52:0c"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <target dev="tap895d0a7c-e1"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/console.log" append="off"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 20:00:33 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:00:33 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:00:33 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:00:33 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.712 187518 DEBUG nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Preparing to wait for external event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.712 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.713 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.713 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.714 187518 DEBUG nova.virt.libvirt.vif [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:00:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1875875969',display_name='tempest-TestNetworkBasicOps-server-1875875969',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1875875969',id=8,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPWO8BIvPMdMW5mPpvy28duxybDjRtZs9spq/TXFqpqIPX+ORupsZKlEA+FcoqTq9lUg4Coh6oFUnatUrdOZTY3dueNHRFB+Sk2YjeUMQCjsIz5vAE8TA3pDtRzbrcbFUg==',key_name='tempest-TestNetworkBasicOps-340010307',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-7y7vsnpg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:00:29Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=8b1a6d47-bfdb-471f-8cde-41890ea21904,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.715 187518 DEBUG nova.network.os_vif_util [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.716 187518 DEBUG nova.network.os_vif_util [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.716 187518 DEBUG os_vif [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.717 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.718 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.718 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.723 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.723 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap895d0a7c-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.724 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap895d0a7c-e1, col_values=(('external_ids', {'iface-id': '895d0a7c-e1eb-4602-8b34-4390b3fca106', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:52:0c', 'vm-uuid': '8b1a6d47-bfdb-471f-8cde-41890ea21904'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.726 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.729 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:00:33 np0005539279 NetworkManager[55703]: <info>  [1764378033.7299] manager: (tap895d0a7c-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.738 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.741 187518 INFO os_vif [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1')#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.798 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.799 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.800 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:10:52:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 20:00:33 np0005539279 nova_compute[187514]: 2025-11-29 01:00:33.800 187518 INFO nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Using config drive#033[00m
Nov 28 20:00:34 np0005539279 nova_compute[187514]: 2025-11-29 01:00:34.644 187518 INFO nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Creating config drive at /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk.config#033[00m
Nov 28 20:00:34 np0005539279 nova_compute[187514]: 2025-11-29 01:00:34.654 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcpjhpot9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:34 np0005539279 nova_compute[187514]: 2025-11-29 01:00:34.795 187518 DEBUG oslo_concurrency.processutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcpjhpot9" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:34 np0005539279 kernel: tap895d0a7c-e1: entered promiscuous mode
Nov 28 20:00:34 np0005539279 NetworkManager[55703]: <info>  [1764378034.8889] manager: (tap895d0a7c-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Nov 28 20:00:34 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:34Z|00118|binding|INFO|Claiming lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 for this chassis.
Nov 28 20:00:34 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:34Z|00119|binding|INFO|895d0a7c-e1eb-4602-8b34-4390b3fca106: Claiming fa:16:3e:10:52:0c 10.100.0.11
Nov 28 20:00:34 np0005539279 nova_compute[187514]: 2025-11-29 01:00:34.890 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:34 np0005539279 nova_compute[187514]: 2025-11-29 01:00:34.904 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:34.929 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:52:0c 10.100.0.11'], port_security=['fa:16:3e:10:52:0c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-919928194', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8b1a6d47-bfdb-471f-8cde-41890ea21904', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99d13d3d-182f-48e0-a407-8e8368320207', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-919928194', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9bd29a3-ae46-41d8-aaea-3325e1bc2031', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a645aae-dbb9-4dd0-9a66-afe665407f03, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=895d0a7c-e1eb-4602-8b34-4390b3fca106) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:00:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:34.932 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 895d0a7c-e1eb-4602-8b34-4390b3fca106 in datapath 99d13d3d-182f-48e0-a407-8e8368320207 bound to our chassis#033[00m
Nov 28 20:00:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:34.934 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 99d13d3d-182f-48e0-a407-8e8368320207#033[00m
Nov 28 20:00:34 np0005539279 systemd-machined[153752]: New machine qemu-8-instance-00000008.
Nov 28 20:00:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:34.953 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b274350f-b55b-4738-8bd4-8a93d67524ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:34.954 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap99d13d3d-11 in ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 20:00:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:34.956 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap99d13d3d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 20:00:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:34.956 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6e3b55-3876-4abc-afe3-e20783493a09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:34.957 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[8101b196-16d1-460a-878c-6c3e02013d64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:34 np0005539279 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Nov 28 20:00:34 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:34Z|00120|binding|INFO|Setting lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 ovn-installed in OVS
Nov 28 20:00:34 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:34Z|00121|binding|INFO|Setting lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 up in Southbound
Nov 28 20:00:34 np0005539279 nova_compute[187514]: 2025-11-29 01:00:34.975 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:34 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:34.981 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[b0086cb9-6af3-4208-be17-d73615efcfc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:34 np0005539279 systemd-udevd[217912]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 20:00:35 np0005539279 NetworkManager[55703]: <info>  [1764378035.0081] device (tap895d0a7c-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 20:00:35 np0005539279 NetworkManager[55703]: <info>  [1764378035.0091] device (tap895d0a7c-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.008 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[d37cdf54-e2bb-4fe8-93f8-9a7b6987cd1f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.048 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[9fb43777-02a8-4220-8080-dec962e81c70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 NetworkManager[55703]: <info>  [1764378035.0563] manager: (tap99d13d3d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.056 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c2ec3649-fdcc-4076-9750-44584e054531]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.095 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf29b6d-5184-4bd6-9d14-cd1aa4eec048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.099 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[efef9d3f-6441-4403-8dc2-d8e61bb62115]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 NetworkManager[55703]: <info>  [1764378035.1296] device (tap99d13d3d-10): carrier: link connected
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.139 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[515c2dcd-574f-42a6-9539-49c98a6f5eac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.166 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[7c6dcaf3-faa9-46f7-a741-b42867eac3fc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99d13d3d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:60:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396883, 'reachable_time': 38101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217945, 'error': None, 'target': 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.197 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[688903e7-44ce-4d80-998b-c04c1acfb6b3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec7:603e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 396883, 'tstamp': 396883}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217950, 'error': None, 'target': 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.222 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[99bd5402-ba08-446a-8eee-122819524fe7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99d13d3d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:60:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396883, 'reachable_time': 38101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217951, 'error': None, 'target': 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.265 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378035.2638242, 8b1a6d47-bfdb-471f-8cde-41890ea21904 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.266 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] VM Started (Lifecycle Event)#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.281 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e4dc6e52-95f5-4387-b866-4471cf1c9bff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.299 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.305 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378035.2642767, 8b1a6d47-bfdb-471f-8cde-41890ea21904 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.306 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] VM Paused (Lifecycle Event)#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.335 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.340 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.372 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0dab0e3d-bf95-40a0-a442-602d012c148a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.374 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99d13d3d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.375 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.375 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99d13d3d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.393 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.420 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:35 np0005539279 NetworkManager[55703]: <info>  [1764378035.4207] manager: (tap99d13d3d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Nov 28 20:00:35 np0005539279 kernel: tap99d13d3d-10: entered promiscuous mode
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.427 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.428 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap99d13d3d-10, col_values=(('external_ids', {'iface-id': '1b956cf5-1d04-4721-9b35-ec869c7a032b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.430 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:35 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:35Z|00122|binding|INFO|Releasing lport 1b956cf5-1d04-4721-9b35-ec869c7a032b from this chassis (sb_readonly=0)
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.454 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.456 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/99d13d3d-182f-48e0-a407-8e8368320207.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/99d13d3d-182f-48e0-a407-8e8368320207.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.458 187518 DEBUG nova.network.neutron [req-430a9e79-0feb-451e-ab4f-91ee3315d5a9 req-08a1a818-bf22-4686-8267-f2687924ce0f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Updated VIF entry in instance network info cache for port 895d0a7c-e1eb-4602-8b34-4390b3fca106. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.459 187518 DEBUG nova.network.neutron [req-430a9e79-0feb-451e-ab4f-91ee3315d5a9 req-08a1a818-bf22-4686-8267-f2687924ce0f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Updating instance_info_cache with network_info: [{"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.458 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0f03adb9-ee8b-4542-9876-64046dd0307e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.461 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-99d13d3d-182f-48e0-a407-8e8368320207
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/99d13d3d-182f-48e0-a407-8e8368320207.pid.haproxy
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID 99d13d3d-182f-48e0-a407-8e8368320207
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 20:00:35 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:35.462 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'env', 'PROCESS_TAG=haproxy-99d13d3d-182f-48e0-a407-8e8368320207', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/99d13d3d-182f-48e0-a407-8e8368320207.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.481 187518 DEBUG oslo_concurrency.lockutils [req-430a9e79-0feb-451e-ab4f-91ee3315d5a9 req-08a1a818-bf22-4686-8267-f2687924ce0f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-8b1a6d47-bfdb-471f-8cde-41890ea21904" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.568 187518 DEBUG nova.compute.manager [req-7aac0f63-4b00-45a0-b7eb-2615b1440763 req-25e6ce91-6104-438a-b29e-3791fbaa01e5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Received event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.568 187518 DEBUG oslo_concurrency.lockutils [req-7aac0f63-4b00-45a0-b7eb-2615b1440763 req-25e6ce91-6104-438a-b29e-3791fbaa01e5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.569 187518 DEBUG oslo_concurrency.lockutils [req-7aac0f63-4b00-45a0-b7eb-2615b1440763 req-25e6ce91-6104-438a-b29e-3791fbaa01e5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.569 187518 DEBUG oslo_concurrency.lockutils [req-7aac0f63-4b00-45a0-b7eb-2615b1440763 req-25e6ce91-6104-438a-b29e-3791fbaa01e5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.570 187518 DEBUG nova.compute.manager [req-7aac0f63-4b00-45a0-b7eb-2615b1440763 req-25e6ce91-6104-438a-b29e-3791fbaa01e5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Processing event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.571 187518 DEBUG nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.579 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378035.579338, 8b1a6d47-bfdb-471f-8cde-41890ea21904 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.580 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] VM Resumed (Lifecycle Event)#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.584 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.591 187518 INFO nova.virt.libvirt.driver [-] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Instance spawned successfully.#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.592 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.611 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.624 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.631 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.632 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.633 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.634 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.635 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.636 187518 DEBUG nova.virt.libvirt.driver [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.660 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.696 187518 INFO nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Took 5.94 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.697 187518 DEBUG nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.773 187518 INFO nova.compute.manager [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Took 6.50 seconds to build instance.#033[00m
Nov 28 20:00:35 np0005539279 nova_compute[187514]: 2025-11-29 01:00:35.793 187518 DEBUG oslo_concurrency.lockutils [None req-0e4808fe-baf5-4ac7-8252-64d37ed2896b 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:35 np0005539279 podman[217982]: 2025-11-29 01:00:35.967807084 +0000 UTC m=+0.077284452 container create d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 20:00:36 np0005539279 podman[217982]: 2025-11-29 01:00:35.920610661 +0000 UTC m=+0.030088079 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 20:00:36 np0005539279 systemd[1]: Started libpod-conmon-d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c.scope.
Nov 28 20:00:36 np0005539279 systemd[1]: Started libcrun container.
Nov 28 20:00:36 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c0b952366b56a71e790fc05e6f250d74c9336ecf6c68869d13ca5f0732179ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 20:00:36 np0005539279 podman[217982]: 2025-11-29 01:00:36.094833877 +0000 UTC m=+0.204311295 container init d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 20:00:36 np0005539279 podman[217982]: 2025-11-29 01:00:36.107441939 +0000 UTC m=+0.216919307 container start d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 20:00:36 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[217997]: [NOTICE]   (218001) : New worker (218003) forked
Nov 28 20:00:36 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[217997]: [NOTICE]   (218001) : Loading success.
Nov 28 20:00:37 np0005539279 nova_compute[187514]: 2025-11-29 01:00:37.563 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:37 np0005539279 nova_compute[187514]: 2025-11-29 01:00:37.675 187518 DEBUG nova.compute.manager [req-a86091dc-723a-45cb-96d8-c48649ca3db0 req-420a9dec-36aa-45fd-af3b-a03f9302564f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Received event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:37 np0005539279 nova_compute[187514]: 2025-11-29 01:00:37.677 187518 DEBUG oslo_concurrency.lockutils [req-a86091dc-723a-45cb-96d8-c48649ca3db0 req-420a9dec-36aa-45fd-af3b-a03f9302564f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:37 np0005539279 nova_compute[187514]: 2025-11-29 01:00:37.679 187518 DEBUG oslo_concurrency.lockutils [req-a86091dc-723a-45cb-96d8-c48649ca3db0 req-420a9dec-36aa-45fd-af3b-a03f9302564f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:37 np0005539279 nova_compute[187514]: 2025-11-29 01:00:37.679 187518 DEBUG oslo_concurrency.lockutils [req-a86091dc-723a-45cb-96d8-c48649ca3db0 req-420a9dec-36aa-45fd-af3b-a03f9302564f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:37 np0005539279 nova_compute[187514]: 2025-11-29 01:00:37.680 187518 DEBUG nova.compute.manager [req-a86091dc-723a-45cb-96d8-c48649ca3db0 req-420a9dec-36aa-45fd-af3b-a03f9302564f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] No waiting events found dispatching network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:37 np0005539279 nova_compute[187514]: 2025-11-29 01:00:37.680 187518 WARNING nova.compute.manager [req-a86091dc-723a-45cb-96d8-c48649ca3db0 req-420a9dec-36aa-45fd-af3b-a03f9302564f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Received unexpected event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 for instance with vm_state active and task_state None.#033[00m
Nov 28 20:00:38 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:38Z|00123|binding|INFO|Releasing lport 1b956cf5-1d04-4721-9b35-ec869c7a032b from this chassis (sb_readonly=0)
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.098 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:38 np0005539279 NetworkManager[55703]: <info>  [1764378038.0998] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 28 20:00:38 np0005539279 NetworkManager[55703]: <info>  [1764378038.1015] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 28 20:00:38 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:38Z|00124|binding|INFO|Releasing lport 1b956cf5-1d04-4721-9b35-ec869c7a032b from this chassis (sb_readonly=0)
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.158 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.163 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.729 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.768 187518 DEBUG oslo_concurrency.lockutils [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "8b1a6d47-bfdb-471f-8cde-41890ea21904" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.769 187518 DEBUG oslo_concurrency.lockutils [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.769 187518 DEBUG oslo_concurrency.lockutils [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.769 187518 DEBUG oslo_concurrency.lockutils [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.769 187518 DEBUG oslo_concurrency.lockutils [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.771 187518 INFO nova.compute.manager [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Terminating instance#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.772 187518 DEBUG nova.compute.manager [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 20:00:38 np0005539279 kernel: tap895d0a7c-e1 (unregistering): left promiscuous mode
Nov 28 20:00:38 np0005539279 NetworkManager[55703]: <info>  [1764378038.8021] device (tap895d0a7c-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 20:00:38 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:38Z|00125|binding|INFO|Releasing lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 from this chassis (sb_readonly=0)
Nov 28 20:00:38 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:38Z|00126|binding|INFO|Setting lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 down in Southbound
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.816 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:38 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:38Z|00127|binding|INFO|Removing iface tap895d0a7c-e1 ovn-installed in OVS
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.820 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:38.827 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:52:0c 10.100.0.11'], port_security=['fa:16:3e:10:52:0c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-919928194', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '8b1a6d47-bfdb-471f-8cde-41890ea21904', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99d13d3d-182f-48e0-a407-8e8368320207', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-919928194', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9bd29a3-ae46-41d8-aaea-3325e1bc2031', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a645aae-dbb9-4dd0-9a66-afe665407f03, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=895d0a7c-e1eb-4602-8b34-4390b3fca106) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:00:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:38.829 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 895d0a7c-e1eb-4602-8b34-4390b3fca106 in datapath 99d13d3d-182f-48e0-a407-8e8368320207 unbound from our chassis#033[00m
Nov 28 20:00:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:38.831 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 99d13d3d-182f-48e0-a407-8e8368320207, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 20:00:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:38.833 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[f3a407f8-8ed8-4aa3-9abb-13d90b46e1a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:38 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:38.834 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 namespace which is not needed anymore#033[00m
Nov 28 20:00:38 np0005539279 nova_compute[187514]: 2025-11-29 01:00:38.848 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:38 np0005539279 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 28 20:00:38 np0005539279 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 3.492s CPU time.
Nov 28 20:00:38 np0005539279 systemd-machined[153752]: Machine qemu-8-instance-00000008 terminated.
Nov 28 20:00:39 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[217997]: [NOTICE]   (218001) : haproxy version is 2.8.14-c23fe91
Nov 28 20:00:39 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[217997]: [NOTICE]   (218001) : path to executable is /usr/sbin/haproxy
Nov 28 20:00:39 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[217997]: [WARNING]  (218001) : Exiting Master process...
Nov 28 20:00:39 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[217997]: [ALERT]    (218001) : Current worker (218003) exited with code 143 (Terminated)
Nov 28 20:00:39 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[217997]: [WARNING]  (218001) : All workers exited. Exiting... (0)
Nov 28 20:00:39 np0005539279 systemd[1]: libpod-d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c.scope: Deactivated successfully.
Nov 28 20:00:39 np0005539279 podman[218037]: 2025-11-29 01:00:39.032835114 +0000 UTC m=+0.072066680 container died d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.055 187518 INFO nova.virt.libvirt.driver [-] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Instance destroyed successfully.#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.056 187518 DEBUG nova.objects.instance [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid 8b1a6d47-bfdb-471f-8cde-41890ea21904 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:39 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c-userdata-shm.mount: Deactivated successfully.
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.075 187518 DEBUG nova.virt.libvirt.vif [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T01:00:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1875875969',display_name='tempest-TestNetworkBasicOps-server-1875875969',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1875875969',id=8,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPWO8BIvPMdMW5mPpvy28duxybDjRtZs9spq/TXFqpqIPX+ORupsZKlEA+FcoqTq9lUg4Coh6oFUnatUrdOZTY3dueNHRFB+Sk2YjeUMQCjsIz5vAE8TA3pDtRzbrcbFUg==',key_name='tempest-TestNetworkBasicOps-340010307',keypairs=<?>,launch_index=0,launched_at=2025-11-29T01:00:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-7y7vsnpg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T01:00:35Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=8b1a6d47-bfdb-471f-8cde-41890ea21904,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.076 187518 DEBUG nova.network.os_vif_util [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.077 187518 DEBUG nova.network.os_vif_util [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.078 187518 DEBUG os_vif [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 20:00:39 np0005539279 systemd[1]: var-lib-containers-storage-overlay-3c0b952366b56a71e790fc05e6f250d74c9336ecf6c68869d13ca5f0732179ef-merged.mount: Deactivated successfully.
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.080 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.081 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap895d0a7c-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:39 np0005539279 podman[218037]: 2025-11-29 01:00:39.084174689 +0000 UTC m=+0.123406265 container cleanup d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.133 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.135 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.138 187518 INFO os_vif [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1')#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.138 187518 INFO nova.virt.libvirt.driver [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Deleting instance files /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904_del#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.139 187518 INFO nova.virt.libvirt.driver [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Deletion of /var/lib/nova/instances/8b1a6d47-bfdb-471f-8cde-41890ea21904_del complete#033[00m
Nov 28 20:00:39 np0005539279 systemd[1]: libpod-conmon-d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c.scope: Deactivated successfully.
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.188 187518 INFO nova.compute.manager [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.189 187518 DEBUG oslo.service.loopingcall [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.190 187518 DEBUG nova.compute.manager [-] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.190 187518 DEBUG nova.network.neutron [-] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 20:00:39 np0005539279 podman[218083]: 2025-11-29 01:00:39.204940201 +0000 UTC m=+0.047690987 container remove d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 20:00:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:39.212 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[43f9457a-001c-450b-8ef3-80bbeb69184a]: (4, ('Sat Nov 29 01:00:38 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 (d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c)\nd25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c\nSat Nov 29 01:00:39 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 (d25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c)\nd25e6208bc700572a9c0429b07fb562f0f780340c1e50d87032bf99f583ec26c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:39.214 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[58ef30a1-bfbb-4303-8593-0c746cb94f60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:39.216 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99d13d3d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.218 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:39 np0005539279 kernel: tap99d13d3d-10: left promiscuous mode
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.221 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:39.224 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[77d4c53d-843d-4b60-a804-469caa9343e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.233 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:39.244 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b9430edd-eb06-4db7-a92a-1e03499dabec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:39.245 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a7cfca-b658-4a61-9415-fefa0dcd7d7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:39.267 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[f80ad30c-f37c-45f9-863a-c32dbb6beb9f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 396874, 'reachable_time': 29311, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218097, 'error': None, 'target': 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:39.272 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 20:00:39 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:39.272 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[11fdcecd-a4be-4c78-b1d3-833425c75061]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:39 np0005539279 systemd[1]: run-netns-ovnmeta\x2d99d13d3d\x2d182f\x2d48e0\x2da407\x2d8e8368320207.mount: Deactivated successfully.
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.774 187518 DEBUG nova.compute.manager [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Received event network-changed-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.775 187518 DEBUG nova.compute.manager [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Refreshing instance network info cache due to event network-changed-895d0a7c-e1eb-4602-8b34-4390b3fca106. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.775 187518 DEBUG oslo_concurrency.lockutils [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-8b1a6d47-bfdb-471f-8cde-41890ea21904" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.775 187518 DEBUG oslo_concurrency.lockutils [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-8b1a6d47-bfdb-471f-8cde-41890ea21904" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:00:39 np0005539279 nova_compute[187514]: 2025-11-29 01:00:39.776 187518 DEBUG nova.network.neutron [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Refreshing network info cache for port 895d0a7c-e1eb-4602-8b34-4390b3fca106 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:00:40 np0005539279 nova_compute[187514]: 2025-11-29 01:00:40.427 187518 DEBUG nova.network.neutron [-] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:40 np0005539279 nova_compute[187514]: 2025-11-29 01:00:40.444 187518 INFO nova.compute.manager [-] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Took 1.25 seconds to deallocate network for instance.#033[00m
Nov 28 20:00:40 np0005539279 nova_compute[187514]: 2025-11-29 01:00:40.485 187518 DEBUG oslo_concurrency.lockutils [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:40 np0005539279 nova_compute[187514]: 2025-11-29 01:00:40.486 187518 DEBUG oslo_concurrency.lockutils [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:40 np0005539279 nova_compute[187514]: 2025-11-29 01:00:40.548 187518 DEBUG nova.compute.provider_tree [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:00:40 np0005539279 nova_compute[187514]: 2025-11-29 01:00:40.562 187518 DEBUG nova.scheduler.client.report [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:00:40 np0005539279 nova_compute[187514]: 2025-11-29 01:00:40.582 187518 DEBUG oslo_concurrency.lockutils [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:40 np0005539279 nova_compute[187514]: 2025-11-29 01:00:40.617 187518 INFO nova.scheduler.client.report [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance 8b1a6d47-bfdb-471f-8cde-41890ea21904#033[00m
Nov 28 20:00:40 np0005539279 nova_compute[187514]: 2025-11-29 01:00:40.694 187518 DEBUG oslo_concurrency.lockutils [None req-0903c9ac-5ebb-4d3c-98bf-508af1ba4394 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.925s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.085 187518 DEBUG nova.network.neutron [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Updated VIF entry in instance network info cache for port 895d0a7c-e1eb-4602-8b34-4390b3fca106. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.087 187518 DEBUG nova.network.neutron [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Updating instance_info_cache with network_info: [{"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.106 187518 DEBUG oslo_concurrency.lockutils [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-8b1a6d47-bfdb-471f-8cde-41890ea21904" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.106 187518 DEBUG nova.compute.manager [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Received event network-vif-unplugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.107 187518 DEBUG oslo_concurrency.lockutils [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.108 187518 DEBUG oslo_concurrency.lockutils [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.108 187518 DEBUG oslo_concurrency.lockutils [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.108 187518 DEBUG nova.compute.manager [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] No waiting events found dispatching network-vif-unplugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.109 187518 DEBUG nova.compute.manager [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Received event network-vif-unplugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.109 187518 DEBUG nova.compute.manager [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Received event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.110 187518 DEBUG oslo_concurrency.lockutils [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.110 187518 DEBUG oslo_concurrency.lockutils [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.110 187518 DEBUG oslo_concurrency.lockutils [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "8b1a6d47-bfdb-471f-8cde-41890ea21904-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.111 187518 DEBUG nova.compute.manager [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] No waiting events found dispatching network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:41 np0005539279 nova_compute[187514]: 2025-11-29 01:00:41.111 187518 WARNING nova.compute.manager [req-f8478cf8-0630-4290-929d-9e4fbb2b6d37 req-0e2e4a15-8a55-4446-b9da-fe5d07e46d4f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Received unexpected event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 20:00:42 np0005539279 nova_compute[187514]: 2025-11-29 01:00:42.565 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:44 np0005539279 nova_compute[187514]: 2025-11-29 01:00:44.134 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:46 np0005539279 podman[218099]: 2025-11-29 01:00:46.863376165 +0000 UTC m=+0.092021432 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:00:46 np0005539279 podman[218098]: 2025-11-29 01:00:46.86906859 +0000 UTC m=+0.098498089 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 28 20:00:46 np0005539279 podman[218100]: 2025-11-29 01:00:46.902541299 +0000 UTC m=+0.126503259 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:00:47 np0005539279 nova_compute[187514]: 2025-11-29 01:00:47.567 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:49 np0005539279 nova_compute[187514]: 2025-11-29 01:00:49.136 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.243 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "5b12c763-5c01-4955-847a-ffa225d6246a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.244 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.262 187518 DEBUG nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.361 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.362 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.372 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.373 187518 INFO nova.compute.claims [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.566 187518 DEBUG nova.compute.provider_tree [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.582 187518 DEBUG nova.scheduler.client.report [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.610 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.610 187518 DEBUG nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.661 187518 DEBUG nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.661 187518 DEBUG nova.network.neutron [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.695 187518 INFO nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.718 187518 DEBUG nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.823 187518 DEBUG nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.825 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.826 187518 INFO nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Creating image(s)#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.827 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.828 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.830 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.855 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.947 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.950 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.952 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:50 np0005539279 nova_compute[187514]: 2025-11-29 01:00:50.980 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.064 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.066 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.122 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.124 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.125 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.217 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.218 187518 DEBUG nova.virt.disk.api [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.219 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.306 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.308 187518 DEBUG nova.virt.disk.api [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.309 187518 DEBUG nova.objects.instance [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid 5b12c763-5c01-4955-847a-ffa225d6246a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.338 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.339 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Ensure instance console log exists: /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.340 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.341 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.341 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:51 np0005539279 nova_compute[187514]: 2025-11-29 01:00:51.642 187518 DEBUG nova.policy [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 20:00:52 np0005539279 nova_compute[187514]: 2025-11-29 01:00:52.571 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:53 np0005539279 nova_compute[187514]: 2025-11-29 01:00:53.757 187518 DEBUG nova.network.neutron [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Successfully updated port: 895d0a7c-e1eb-4602-8b34-4390b3fca106 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 20:00:53 np0005539279 nova_compute[187514]: 2025-11-29 01:00:53.772 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-5b12c763-5c01-4955-847a-ffa225d6246a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:00:53 np0005539279 nova_compute[187514]: 2025-11-29 01:00:53.772 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-5b12c763-5c01-4955-847a-ffa225d6246a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:00:53 np0005539279 nova_compute[187514]: 2025-11-29 01:00:53.772 187518 DEBUG nova.network.neutron [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 20:00:53 np0005539279 nova_compute[187514]: 2025-11-29 01:00:53.898 187518 DEBUG nova.compute.manager [req-8dfae085-5ade-455d-880d-7c71e40b2489 req-86b4a371-abd9-43a0-a45f-cae2122abac6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Received event network-changed-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:53 np0005539279 nova_compute[187514]: 2025-11-29 01:00:53.898 187518 DEBUG nova.compute.manager [req-8dfae085-5ade-455d-880d-7c71e40b2489 req-86b4a371-abd9-43a0-a45f-cae2122abac6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Refreshing instance network info cache due to event network-changed-895d0a7c-e1eb-4602-8b34-4390b3fca106. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:00:53 np0005539279 nova_compute[187514]: 2025-11-29 01:00:53.898 187518 DEBUG oslo_concurrency.lockutils [req-8dfae085-5ade-455d-880d-7c71e40b2489 req-86b4a371-abd9-43a0-a45f-cae2122abac6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-5b12c763-5c01-4955-847a-ffa225d6246a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:00:54 np0005539279 nova_compute[187514]: 2025-11-29 01:00:54.053 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764378039.0521593, 8b1a6d47-bfdb-471f-8cde-41890ea21904 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:00:54 np0005539279 nova_compute[187514]: 2025-11-29 01:00:54.054 187518 INFO nova.compute.manager [-] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] VM Stopped (Lifecycle Event)#033[00m
Nov 28 20:00:54 np0005539279 nova_compute[187514]: 2025-11-29 01:00:54.083 187518 DEBUG nova.compute.manager [None req-cba56158-ad78-4ff0-b13a-dd11bd0f5386 - - - - - -] [instance: 8b1a6d47-bfdb-471f-8cde-41890ea21904] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:54 np0005539279 nova_compute[187514]: 2025-11-29 01:00:54.174 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:54 np0005539279 nova_compute[187514]: 2025-11-29 01:00:54.403 187518 DEBUG nova.network.neutron [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.774 187518 DEBUG nova.network.neutron [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Updating instance_info_cache with network_info: [{"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.811 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-5b12c763-5c01-4955-847a-ffa225d6246a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.812 187518 DEBUG nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Instance network_info: |[{"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.813 187518 DEBUG oslo_concurrency.lockutils [req-8dfae085-5ade-455d-880d-7c71e40b2489 req-86b4a371-abd9-43a0-a45f-cae2122abac6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-5b12c763-5c01-4955-847a-ffa225d6246a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.813 187518 DEBUG nova.network.neutron [req-8dfae085-5ade-455d-880d-7c71e40b2489 req-86b4a371-abd9-43a0-a45f-cae2122abac6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Refreshing network info cache for port 895d0a7c-e1eb-4602-8b34-4390b3fca106 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.816 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Start _get_guest_xml network_info=[{"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.822 187518 WARNING nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.829 187518 DEBUG nova.virt.libvirt.host [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.830 187518 DEBUG nova.virt.libvirt.host [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.834 187518 DEBUG nova.virt.libvirt.host [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.835 187518 DEBUG nova.virt.libvirt.host [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.836 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.836 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.837 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.837 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.837 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.837 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.838 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.838 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.838 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.839 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.839 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.839 187518 DEBUG nova.virt.hardware [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.844 187518 DEBUG nova.virt.libvirt.vif [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:00:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-519863814',display_name='tempest-TestNetworkBasicOps-server-519863814',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-519863814',id=9,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODFkkXIi+A2+yDfMj1ERiyGon1xouxU9jBHHy09iJJ26MaPMi/0/GDhk/DvSxxIDN/o6Bnw2D1Nv8t7Gl6w19XYmDoSLhUzEh7iy+ZDFbXgahNEMDXT58O+c2kufZU0lQ==',key_name='tempest-TestNetworkBasicOps-1875615411',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-6wmwsw4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:00:50Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=5b12c763-5c01-4955-847a-ffa225d6246a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.844 187518 DEBUG nova.network.os_vif_util [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.845 187518 DEBUG nova.network.os_vif_util [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.846 187518 DEBUG nova.objects.instance [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5b12c763-5c01-4955-847a-ffa225d6246a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.872 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] End _get_guest_xml xml=<domain type="kvm">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <uuid>5b12c763-5c01-4955-847a-ffa225d6246a</uuid>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <name>instance-00000009</name>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-519863814</nova:name>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 01:00:55</nova:creationTime>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:        <nova:port uuid="895d0a7c-e1eb-4602-8b34-4390b3fca106">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <entry name="serial">5b12c763-5c01-4955-847a-ffa225d6246a</entry>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <entry name="uuid">5b12c763-5c01-4955-847a-ffa225d6246a</entry>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk.config"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:10:52:0c"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <target dev="tap895d0a7c-e1"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/console.log" append="off"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 20:00:55 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:00:55 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:00:55 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:00:55 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.874 187518 DEBUG nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Preparing to wait for external event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.875 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.875 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.875 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.877 187518 DEBUG nova.virt.libvirt.vif [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:00:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-519863814',display_name='tempest-TestNetworkBasicOps-server-519863814',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-519863814',id=9,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODFkkXIi+A2+yDfMj1ERiyGon1xouxU9jBHHy09iJJ26MaPMi/0/GDhk/DvSxxIDN/o6Bnw2D1Nv8t7Gl6w19XYmDoSLhUzEh7iy+ZDFbXgahNEMDXT58O+c2kufZU0lQ==',key_name='tempest-TestNetworkBasicOps-1875615411',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-6wmwsw4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:00:50Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=5b12c763-5c01-4955-847a-ffa225d6246a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.877 187518 DEBUG nova.network.os_vif_util [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.878 187518 DEBUG nova.network.os_vif_util [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.879 187518 DEBUG os_vif [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.880 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.886 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.890 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.902 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.903 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap895d0a7c-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.903 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap895d0a7c-e1, col_values=(('external_ids', {'iface-id': '895d0a7c-e1eb-4602-8b34-4390b3fca106', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:10:52:0c', 'vm-uuid': '5b12c763-5c01-4955-847a-ffa225d6246a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.905 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:55 np0005539279 NetworkManager[55703]: <info>  [1764378055.9072] manager: (tap895d0a7c-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.907 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.913 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.914 187518 INFO os_vif [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1')#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.980 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.981 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.981 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:10:52:0c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 20:00:55 np0005539279 nova_compute[187514]: 2025-11-29 01:00:55.982 187518 INFO nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Using config drive#033[00m
Nov 28 20:00:56 np0005539279 nova_compute[187514]: 2025-11-29 01:00:56.643 187518 INFO nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Creating config drive at /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk.config#033[00m
Nov 28 20:00:56 np0005539279 nova_compute[187514]: 2025-11-29 01:00:56.652 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjtsdb8b7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:00:56 np0005539279 nova_compute[187514]: 2025-11-29 01:00:56.797 187518 DEBUG oslo_concurrency.processutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjtsdb8b7" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:00:56 np0005539279 kernel: tap895d0a7c-e1: entered promiscuous mode
Nov 28 20:00:56 np0005539279 NetworkManager[55703]: <info>  [1764378056.8885] manager: (tap895d0a7c-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Nov 28 20:00:56 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:56Z|00128|binding|INFO|Claiming lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 for this chassis.
Nov 28 20:00:56 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:56Z|00129|binding|INFO|895d0a7c-e1eb-4602-8b34-4390b3fca106: Claiming fa:16:3e:10:52:0c 10.100.0.11
Nov 28 20:00:56 np0005539279 nova_compute[187514]: 2025-11-29 01:00:56.914 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.925 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:52:0c 10.100.0.11'], port_security=['fa:16:3e:10:52:0c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-919928194', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5b12c763-5c01-4955-847a-ffa225d6246a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99d13d3d-182f-48e0-a407-8e8368320207', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-919928194', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'f9bd29a3-ae46-41d8-aaea-3325e1bc2031', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a645aae-dbb9-4dd0-9a66-afe665407f03, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=895d0a7c-e1eb-4602-8b34-4390b3fca106) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.926 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 895d0a7c-e1eb-4602-8b34-4390b3fca106 in datapath 99d13d3d-182f-48e0-a407-8e8368320207 bound to our chassis#033[00m
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.927 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 99d13d3d-182f-48e0-a407-8e8368320207#033[00m
Nov 28 20:00:56 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:56Z|00130|binding|INFO|Setting lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 ovn-installed in OVS
Nov 28 20:00:56 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:56Z|00131|binding|INFO|Setting lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 up in Southbound
Nov 28 20:00:56 np0005539279 nova_compute[187514]: 2025-11-29 01:00:56.936 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:56 np0005539279 nova_compute[187514]: 2025-11-29 01:00:56.940 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.946 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[75c84985-e520-493b-97ff-46aae59a91ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.947 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap99d13d3d-11 in ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.949 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap99d13d3d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.949 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[909ecb17-ce59-404c-946e-38b09656e5fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:56 np0005539279 systemd-machined[153752]: New machine qemu-9-instance-00000009.
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.950 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[61170a6f-82fd-4c67-85c7-891fb24cd946]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:56 np0005539279 systemd-udevd[218204]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 20:00:56 np0005539279 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.969 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[d1fda671-3791-4767-b64a-8a508e47fae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:56 np0005539279 NetworkManager[55703]: <info>  [1764378056.9725] device (tap895d0a7c-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 20:00:56 np0005539279 NetworkManager[55703]: <info>  [1764378056.9737] device (tap895d0a7c-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 20:00:56 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:56.990 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[8186e5d8-24fc-438f-8dd2-17891a0de26a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.027 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[e11a9f69-6ba2-492d-815a-70b15cdeea65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.035 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0605e4c4-e69f-4afb-ab9e-277bf2a72add]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 NetworkManager[55703]: <info>  [1764378057.0370] manager: (tap99d13d3d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.078 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[2e20011a-df3d-4c9c-b58a-55b0d9274a71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.084 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[b3543534-3b94-473f-a120-05e22071c20d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 NetworkManager[55703]: <info>  [1764378057.1173] device (tap99d13d3d-10): carrier: link connected
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.123 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[f22a20af-a8eb-42ce-8619-f9fb66a3f41c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.154 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[64519ffb-2a81-45c0-ad1f-81634a81a32d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99d13d3d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:60:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399082, 'reachable_time': 18711, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218238, 'error': None, 'target': 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.180 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b16684f3-79a3-4827-8370-cf62800e15c5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec7:603e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399082, 'tstamp': 399082}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218239, 'error': None, 'target': 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.211 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[1e4712f3-6ec0-4568-be29-05609ad56b2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99d13d3d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c7:60:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399082, 'reachable_time': 18711, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218240, 'error': None, 'target': 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.263 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[99318904-6739-4cb5-9946-2778ff5e3f1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.372 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378057.3705697, 5b12c763-5c01-4955-847a-ffa225d6246a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.373 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] VM Started (Lifecycle Event)#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.373 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[01862b86-56ad-4ad9-b6a8-c56a2f057b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.375 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99d13d3d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.375 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.376 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99d13d3d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.378 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:57 np0005539279 NetworkManager[55703]: <info>  [1764378057.3796] manager: (tap99d13d3d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Nov 28 20:00:57 np0005539279 kernel: tap99d13d3d-10: entered promiscuous mode
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.381 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.383 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap99d13d3d-10, col_values=(('external_ids', {'iface-id': '1b956cf5-1d04-4721-9b35-ec869c7a032b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.384 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:57 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:57Z|00132|binding|INFO|Releasing lport 1b956cf5-1d04-4721-9b35-ec869c7a032b from this chassis (sb_readonly=0)
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.401 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.406 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378057.3719344, 5b12c763-5c01-4955-847a-ffa225d6246a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.406 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] VM Paused (Lifecycle Event)#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.411 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.413 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/99d13d3d-182f-48e0-a407-8e8368320207.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/99d13d3d-182f-48e0-a407-8e8368320207.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.414 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[1d936e5a-d2f4-4ab8-91bb-f2703b6abfb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.416 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-99d13d3d-182f-48e0-a407-8e8368320207
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/99d13d3d-182f-48e0-a407-8e8368320207.pid.haproxy
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID 99d13d3d-182f-48e0-a407-8e8368320207
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 20:00:57 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:57.417 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'env', 'PROCESS_TAG=haproxy-99d13d3d-182f-48e0-a407-8e8368320207', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/99d13d3d-182f-48e0-a407-8e8368320207.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.429 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.436 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.469 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.574 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.643 187518 DEBUG nova.compute.manager [req-7b455f20-656f-44ee-9cb6-8141152a5880 req-dd1d55f9-a69b-4147-b5e9-bb1dccd04b3e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Received event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.644 187518 DEBUG oslo_concurrency.lockutils [req-7b455f20-656f-44ee-9cb6-8141152a5880 req-dd1d55f9-a69b-4147-b5e9-bb1dccd04b3e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.645 187518 DEBUG oslo_concurrency.lockutils [req-7b455f20-656f-44ee-9cb6-8141152a5880 req-dd1d55f9-a69b-4147-b5e9-bb1dccd04b3e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.646 187518 DEBUG oslo_concurrency.lockutils [req-7b455f20-656f-44ee-9cb6-8141152a5880 req-dd1d55f9-a69b-4147-b5e9-bb1dccd04b3e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.647 187518 DEBUG nova.compute.manager [req-7b455f20-656f-44ee-9cb6-8141152a5880 req-dd1d55f9-a69b-4147-b5e9-bb1dccd04b3e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Processing event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.648 187518 DEBUG nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.655 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.664 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378057.66427, 5b12c763-5c01-4955-847a-ffa225d6246a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.665 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] VM Resumed (Lifecycle Event)#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.669 187518 INFO nova.virt.libvirt.driver [-] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Instance spawned successfully.#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.670 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.686 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.695 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.703 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.704 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.704 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.705 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.705 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.706 187518 DEBUG nova.virt.libvirt.driver [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.713 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.766 187518 INFO nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Took 6.94 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.768 187518 DEBUG nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.845 187518 INFO nova.compute.manager [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Took 7.53 seconds to build instance.#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.862 187518 DEBUG oslo_concurrency.lockutils [None req-4f7f9402-da25-40d7-8475-92cddbbca92a 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:57 np0005539279 podman[218279]: 2025-11-29 01:00:57.898863449 +0000 UTC m=+0.079434389 container create 9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.952 187518 DEBUG nova.network.neutron [req-8dfae085-5ade-455d-880d-7c71e40b2489 req-86b4a371-abd9-43a0-a45f-cae2122abac6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Updated VIF entry in instance network info cache for port 895d0a7c-e1eb-4602-8b34-4390b3fca106. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.953 187518 DEBUG nova.network.neutron [req-8dfae085-5ade-455d-880d-7c71e40b2489 req-86b4a371-abd9-43a0-a45f-cae2122abac6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Updating instance_info_cache with network_info: [{"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:00:57 np0005539279 podman[218279]: 2025-11-29 01:00:57.85864215 +0000 UTC m=+0.039213100 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 20:00:57 np0005539279 systemd[1]: Started libpod-conmon-9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932.scope.
Nov 28 20:00:57 np0005539279 nova_compute[187514]: 2025-11-29 01:00:57.972 187518 DEBUG oslo_concurrency.lockutils [req-8dfae085-5ade-455d-880d-7c71e40b2489 req-86b4a371-abd9-43a0-a45f-cae2122abac6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-5b12c763-5c01-4955-847a-ffa225d6246a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:00:57 np0005539279 systemd[1]: Started libcrun container.
Nov 28 20:00:58 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7dca0874d55bdb38cafa9d44ff3dbefc954a52599a7f5f57475de269555c6cc0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 20:00:58 np0005539279 podman[218279]: 2025-11-29 01:00:58.028282838 +0000 UTC m=+0.208853798 container init 9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:00:58 np0005539279 podman[218279]: 2025-11-29 01:00:58.034774357 +0000 UTC m=+0.215345297 container start 9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:00:58 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[218294]: [NOTICE]   (218298) : New worker (218300) forked
Nov 28 20:00:58 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[218294]: [NOTICE]   (218298) : Loading success.
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.599 187518 DEBUG oslo_concurrency.lockutils [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "5b12c763-5c01-4955-847a-ffa225d6246a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.601 187518 DEBUG oslo_concurrency.lockutils [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.601 187518 DEBUG oslo_concurrency.lockutils [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.602 187518 DEBUG oslo_concurrency.lockutils [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.602 187518 DEBUG oslo_concurrency.lockutils [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.604 187518 INFO nova.compute.manager [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Terminating instance#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.606 187518 DEBUG nova.compute.manager [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 20:00:59 np0005539279 kernel: tap895d0a7c-e1 (unregistering): left promiscuous mode
Nov 28 20:00:59 np0005539279 NetworkManager[55703]: <info>  [1764378059.6421] device (tap895d0a7c-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 20:00:59 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:59Z|00133|binding|INFO|Releasing lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 from this chassis (sb_readonly=0)
Nov 28 20:00:59 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:59Z|00134|binding|INFO|Setting lport 895d0a7c-e1eb-4602-8b34-4390b3fca106 down in Southbound
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.653 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:59 np0005539279 ovn_controller[95686]: 2025-11-29T01:00:59Z|00135|binding|INFO|Removing iface tap895d0a7c-e1 ovn-installed in OVS
Nov 28 20:00:59 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:59.668 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:10:52:0c 10.100.0.11'], port_security=['fa:16:3e:10:52:0c 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-919928194', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5b12c763-5c01-4955-847a-ffa225d6246a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99d13d3d-182f-48e0-a407-8e8368320207', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-919928194', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'f9bd29a3-ae46-41d8-aaea-3325e1bc2031', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.229', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a645aae-dbb9-4dd0-9a66-afe665407f03, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=895d0a7c-e1eb-4602-8b34-4390b3fca106) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:00:59 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:59.669 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 895d0a7c-e1eb-4602-8b34-4390b3fca106 in datapath 99d13d3d-182f-48e0-a407-8e8368320207 unbound from our chassis#033[00m
Nov 28 20:00:59 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:59.671 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 99d13d3d-182f-48e0-a407-8e8368320207, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 20:00:59 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:59.672 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[78bb2e88-9f89-4901-8612-7646f8525585]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:59 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:59.673 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 namespace which is not needed anymore#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.691 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:59 np0005539279 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 28 20:00:59 np0005539279 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 2.265s CPU time.
Nov 28 20:00:59 np0005539279 systemd-machined[153752]: Machine qemu-9-instance-00000009 terminated.
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.756 187518 DEBUG nova.compute.manager [req-0203a9f1-557b-4f2f-b461-280f3ab46cd2 req-b29f50ab-e182-4dd9-b254-38f7905ee0a3 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Received event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.756 187518 DEBUG oslo_concurrency.lockutils [req-0203a9f1-557b-4f2f-b461-280f3ab46cd2 req-b29f50ab-e182-4dd9-b254-38f7905ee0a3 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.756 187518 DEBUG oslo_concurrency.lockutils [req-0203a9f1-557b-4f2f-b461-280f3ab46cd2 req-b29f50ab-e182-4dd9-b254-38f7905ee0a3 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.756 187518 DEBUG oslo_concurrency.lockutils [req-0203a9f1-557b-4f2f-b461-280f3ab46cd2 req-b29f50ab-e182-4dd9-b254-38f7905ee0a3 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.757 187518 DEBUG nova.compute.manager [req-0203a9f1-557b-4f2f-b461-280f3ab46cd2 req-b29f50ab-e182-4dd9-b254-38f7905ee0a3 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] No waiting events found dispatching network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.757 187518 WARNING nova.compute.manager [req-0203a9f1-557b-4f2f-b461-280f3ab46cd2 req-b29f50ab-e182-4dd9-b254-38f7905ee0a3 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Received unexpected event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 20:00:59 np0005539279 podman[218309]: 2025-11-29 01:00:59.768308768 +0000 UTC m=+0.092302163 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64)
Nov 28 20:00:59 np0005539279 podman[218314]: 2025-11-29 01:00:59.800359439 +0000 UTC m=+0.113969502 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 20:00:59 np0005539279 NetworkManager[55703]: <info>  [1764378059.8335] manager: (tap895d0a7c-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Nov 28 20:00:59 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[218294]: [NOTICE]   (218298) : haproxy version is 2.8.14-c23fe91
Nov 28 20:00:59 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[218294]: [NOTICE]   (218298) : path to executable is /usr/sbin/haproxy
Nov 28 20:00:59 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[218294]: [WARNING]  (218298) : Exiting Master process...
Nov 28 20:00:59 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[218294]: [ALERT]    (218298) : Current worker (218300) exited with code 143 (Terminated)
Nov 28 20:00:59 np0005539279 neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207[218294]: [WARNING]  (218298) : All workers exited. Exiting... (0)
Nov 28 20:00:59 np0005539279 systemd[1]: libpod-9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932.scope: Deactivated successfully.
Nov 28 20:00:59 np0005539279 podman[218374]: 2025-11-29 01:00:59.851792703 +0000 UTC m=+0.056840692 container died 9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 20:00:59 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932-userdata-shm.mount: Deactivated successfully.
Nov 28 20:00:59 np0005539279 systemd[1]: var-lib-containers-storage-overlay-7dca0874d55bdb38cafa9d44ff3dbefc954a52599a7f5f57475de269555c6cc0-merged.mount: Deactivated successfully.
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.897 187518 INFO nova.virt.libvirt.driver [-] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Instance destroyed successfully.#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.898 187518 DEBUG nova.objects.instance [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid 5b12c763-5c01-4955-847a-ffa225d6246a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:00:59 np0005539279 podman[218374]: 2025-11-29 01:00:59.905994527 +0000 UTC m=+0.111042506 container cleanup 9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 20:00:59 np0005539279 systemd[1]: libpod-conmon-9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932.scope: Deactivated successfully.
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.915 187518 DEBUG nova.virt.libvirt.vif [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T01:00:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-519863814',display_name='tempest-TestNetworkBasicOps-server-519863814',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-519863814',id=9,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODFkkXIi+A2+yDfMj1ERiyGon1xouxU9jBHHy09iJJ26MaPMi/0/GDhk/DvSxxIDN/o6Bnw2D1Nv8t7Gl6w19XYmDoSLhUzEh7iy+ZDFbXgahNEMDXT58O+c2kufZU0lQ==',key_name='tempest-TestNetworkBasicOps-1875615411',keypairs=<?>,launch_index=0,launched_at=2025-11-29T01:00:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-6wmwsw4h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T01:00:57Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=5b12c763-5c01-4955-847a-ffa225d6246a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.916 187518 DEBUG nova.network.os_vif_util [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "address": "fa:16:3e:10:52:0c", "network": {"id": "99d13d3d-182f-48e0-a407-8e8368320207", "bridge": "br-int", "label": "tempest-network-smoke--2030326662", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap895d0a7c-e1", "ovs_interfaceid": "895d0a7c-e1eb-4602-8b34-4390b3fca106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.916 187518 DEBUG nova.network.os_vif_util [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.917 187518 DEBUG os_vif [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.919 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.920 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap895d0a7c-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.926 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.930 187518 INFO os_vif [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:10:52:0c,bridge_name='br-int',has_traffic_filtering=True,id=895d0a7c-e1eb-4602-8b34-4390b3fca106,network=Network(99d13d3d-182f-48e0-a407-8e8368320207),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap895d0a7c-e1')#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.931 187518 INFO nova.virt.libvirt.driver [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Deleting instance files /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a_del#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.932 187518 INFO nova.virt.libvirt.driver [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Deletion of /var/lib/nova/instances/5b12c763-5c01-4955-847a-ffa225d6246a_del complete#033[00m
Nov 28 20:00:59 np0005539279 podman[218420]: 2025-11-29 01:00:59.987130664 +0000 UTC m=+0.055011759 container remove 9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.992 187518 INFO nova.compute.manager [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.993 187518 DEBUG oslo.service.loopingcall [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.993 187518 DEBUG nova.compute.manager [-] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 20:00:59 np0005539279 nova_compute[187514]: 2025-11-29 01:00:59.993 187518 DEBUG nova.network.neutron [-] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 20:00:59 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:59.993 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[94dad02f-cc13-4ce3-93e1-52d9758ebc73]: (4, ('Sat Nov 29 01:00:59 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 (9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932)\n9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932\nSat Nov 29 01:00:59 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 (9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932)\n9bc788ffe119114949650fc6aaf3bf640722f0f90e7755e24532466c6f173932\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:59 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:59.996 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c70f1f30-9272-4937-a4b6-44251b8e9254]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:00:59 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:00:59.997 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99d13d3d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:01:00 np0005539279 nova_compute[187514]: 2025-11-29 01:01:00.000 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:00 np0005539279 kernel: tap99d13d3d-10: left promiscuous mode
Nov 28 20:01:00 np0005539279 nova_compute[187514]: 2025-11-29 01:01:00.002 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:00.008 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[cf68be14-a818-49d0-bef3-f062e035bc28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:00 np0005539279 nova_compute[187514]: 2025-11-29 01:01:00.014 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:00.020 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[a9caed7a-64e8-49e3-a0f0-5657ba04d6aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:00.022 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[eee7f338-d690-4935-9b3e-da2d8013b004]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:00.044 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3f5766-0d4b-4bad-8fbe-9ebae8f34461]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399072, 'reachable_time': 38898, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218436, 'error': None, 'target': 'ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:00.046 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-99d13d3d-182f-48e0-a407-8e8368320207 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 20:01:00 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:00.047 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[ec20d490-6d85-4850-a246-54e0b3792471]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:00 np0005539279 systemd[1]: run-netns-ovnmeta\x2d99d13d3d\x2d182f\x2d48e0\x2da407\x2d8e8368320207.mount: Deactivated successfully.
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.866 187518 DEBUG nova.compute.manager [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Received event network-vif-unplugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.868 187518 DEBUG oslo_concurrency.lockutils [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.868 187518 DEBUG oslo_concurrency.lockutils [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.869 187518 DEBUG oslo_concurrency.lockutils [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.869 187518 DEBUG nova.compute.manager [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] No waiting events found dispatching network-vif-unplugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.869 187518 DEBUG nova.compute.manager [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Received event network-vif-unplugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.869 187518 DEBUG nova.compute.manager [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Received event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.870 187518 DEBUG oslo_concurrency.lockutils [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.870 187518 DEBUG oslo_concurrency.lockutils [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.870 187518 DEBUG oslo_concurrency.lockutils [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.870 187518 DEBUG nova.compute.manager [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] No waiting events found dispatching network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:01:01 np0005539279 nova_compute[187514]: 2025-11-29 01:01:01.870 187518 WARNING nova.compute.manager [req-99689eef-388e-4827-96ec-0b5c1d1c268e req-189b8f81-d7fd-480f-8a6e-0651b024a89f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Received unexpected event network-vif-plugged-895d0a7c-e1eb-4602-8b34-4390b3fca106 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.238 187518 DEBUG nova.network.neutron [-] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.262 187518 INFO nova.compute.manager [-] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Took 2.27 seconds to deallocate network for instance.#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.316 187518 DEBUG oslo_concurrency.lockutils [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.316 187518 DEBUG oslo_concurrency.lockutils [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.393 187518 DEBUG nova.compute.provider_tree [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.416 187518 DEBUG nova.scheduler.client.report [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.445 187518 DEBUG oslo_concurrency.lockutils [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.482 187518 INFO nova.scheduler.client.report [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance 5b12c763-5c01-4955-847a-ffa225d6246a#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.560 187518 DEBUG oslo_concurrency.lockutils [None req-a3b89e61-2898-4535-984a-3498ca4914eb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "5b12c763-5c01-4955-847a-ffa225d6246a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.960s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:01:02 np0005539279 nova_compute[187514]: 2025-11-29 01:01:02.576 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:03 np0005539279 podman[218451]: 2025-11-29 01:01:03.874445534 +0000 UTC m=+0.095435453 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 20:01:03 np0005539279 podman[218450]: 2025-11-29 01:01:03.924183799 +0000 UTC m=+0.152052468 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 20:01:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:04.128 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:01:04 np0005539279 nova_compute[187514]: 2025-11-29 01:01:04.129 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:04 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:04.131 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 20:01:04 np0005539279 nova_compute[187514]: 2025-11-29 01:01:04.923 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:07 np0005539279 nova_compute[187514]: 2025-11-29 01:01:07.578 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:08.095 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:01:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:08.096 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:01:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:08.096 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:01:09 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:09.134 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:01:09 np0005539279 nova_compute[187514]: 2025-11-29 01:01:09.202 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:09 np0005539279 nova_compute[187514]: 2025-11-29 01:01:09.233 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:09 np0005539279 nova_compute[187514]: 2025-11-29 01:01:09.926 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:12 np0005539279 nova_compute[187514]: 2025-11-29 01:01:12.581 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:14 np0005539279 nova_compute[187514]: 2025-11-29 01:01:14.888 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764378059.8871183, 5b12c763-5c01-4955-847a-ffa225d6246a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:01:14 np0005539279 nova_compute[187514]: 2025-11-29 01:01:14.889 187518 INFO nova.compute.manager [-] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] VM Stopped (Lifecycle Event)#033[00m
Nov 28 20:01:14 np0005539279 nova_compute[187514]: 2025-11-29 01:01:14.925 187518 DEBUG nova.compute.manager [None req-0d395851-5216-4fa6-a9cf-b3716a6d1ef2 - - - - - -] [instance: 5b12c763-5c01-4955-847a-ffa225d6246a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:01:14 np0005539279 nova_compute[187514]: 2025-11-29 01:01:14.927 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:17 np0005539279 nova_compute[187514]: 2025-11-29 01:01:17.584 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:17 np0005539279 podman[218497]: 2025-11-29 01:01:17.873098958 +0000 UTC m=+0.118195673 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 20:01:17 np0005539279 podman[218498]: 2025-11-29 01:01:17.879971608 +0000 UTC m=+0.112519649 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 20:01:17 np0005539279 podman[218499]: 2025-11-29 01:01:17.903919994 +0000 UTC m=+0.132181050 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 20:01:19 np0005539279 nova_compute[187514]: 2025-11-29 01:01:19.929 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:21 np0005539279 nova_compute[187514]: 2025-11-29 01:01:21.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:01:22 np0005539279 nova_compute[187514]: 2025-11-29 01:01:22.587 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.643 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.644 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.644 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.645 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.925 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.926 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5783MB free_disk=73.3394546508789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.926 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.927 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.996 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:01:23 np0005539279 nova_compute[187514]: 2025-11-29 01:01:23.997 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 20:01:24 np0005539279 nova_compute[187514]: 2025-11-29 01:01:24.019 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 20:01:24 np0005539279 nova_compute[187514]: 2025-11-29 01:01:24.034 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 20:01:24 np0005539279 nova_compute[187514]: 2025-11-29 01:01:24.054 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 20:01:24 np0005539279 nova_compute[187514]: 2025-11-29 01:01:24.055 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:01:24 np0005539279 nova_compute[187514]: 2025-11-29 01:01:24.931 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:26 np0005539279 nova_compute[187514]: 2025-11-29 01:01:26.055 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 20:01:26 np0005539279 nova_compute[187514]: 2025-11-29 01:01:26.056 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 20:01:26 np0005539279 nova_compute[187514]: 2025-11-29 01:01:26.057 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 20:01:26 np0005539279 nova_compute[187514]: 2025-11-29 01:01:26.077 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 20:01:26 np0005539279 nova_compute[187514]: 2025-11-29 01:01:26.078 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 20:01:26 np0005539279 nova_compute[187514]: 2025-11-29 01:01:26.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 20:01:26 np0005539279 nova_compute[187514]: 2025-11-29 01:01:26.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 20:01:27 np0005539279 nova_compute[187514]: 2025-11-29 01:01:27.588 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:29 np0005539279 nova_compute[187514]: 2025-11-29 01:01:29.933 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:30 np0005539279 podman[218561]: 2025-11-29 01:01:30.513764055 +0000 UTC m=+0.076992727 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 20:01:30 np0005539279 podman[218560]: 2025-11-29 01:01:30.535922528 +0000 UTC m=+0.093618839 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 20:01:30 np0005539279 nova_compute[187514]: 2025-11-29 01:01:30.605 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 20:01:30 np0005539279 nova_compute[187514]: 2025-11-29 01:01:30.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 20:01:31 np0005539279 nova_compute[187514]: 2025-11-29 01:01:31.902 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "7eefcff2-7ea5-473c-90fc-3d2795b90204" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 20:01:31 np0005539279 nova_compute[187514]: 2025-11-29 01:01:31.903 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 20:01:31 np0005539279 nova_compute[187514]: 2025-11-29 01:01:31.924 187518 DEBUG nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.036 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.037 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.046 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.046 187518 INFO nova.compute.claims [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Claim successful on node compute-0.ctlplane.example.com
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.206 187518 DEBUG nova.compute.provider_tree [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.223 187518 DEBUG nova.scheduler.client.report [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.243 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.244 187518 DEBUG nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.287 187518 DEBUG nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.288 187518 DEBUG nova.network.neutron [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.305 187518 INFO nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:01:32.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.327 187518 DEBUG nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.419 187518 DEBUG nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.422 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.422 187518 INFO nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Creating image(s)
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.423 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.424 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.425 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.451 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.524 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.526 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.527 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.543 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.590 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.629 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.631 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.689 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.691 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.691 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.777 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.778 187518 DEBUG nova.virt.disk.api [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.779 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.867 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.868 187518 DEBUG nova.virt.disk.api [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.869 187518 DEBUG nova.objects.instance [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid 7eefcff2-7ea5-473c-90fc-3d2795b90204 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.891 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.892 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Ensure instance console log exists: /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.892 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.893 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:01:32 np0005539279 nova_compute[187514]: 2025-11-29 01:01:32.894 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:01:33 np0005539279 nova_compute[187514]: 2025-11-29 01:01:33.465 187518 DEBUG nova.policy [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 20:01:33 np0005539279 nova_compute[187514]: 2025-11-29 01:01:33.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:01:34 np0005539279 podman[218619]: 2025-11-29 01:01:34.860839042 +0000 UTC m=+0.093703292 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Nov 28 20:01:34 np0005539279 podman[218618]: 2025-11-29 01:01:34.909935187 +0000 UTC m=+0.146854255 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:01:34 np0005539279 nova_compute[187514]: 2025-11-29 01:01:34.935 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:36 np0005539279 nova_compute[187514]: 2025-11-29 01:01:36.564 187518 DEBUG nova.network.neutron [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Successfully created port: 8d103d7a-a705-4e47-a6ae-db4ad541cd7b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 20:01:37 np0005539279 nova_compute[187514]: 2025-11-29 01:01:37.595 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:38 np0005539279 nova_compute[187514]: 2025-11-29 01:01:38.137 187518 DEBUG nova.network.neutron [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Successfully updated port: 8d103d7a-a705-4e47-a6ae-db4ad541cd7b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 20:01:38 np0005539279 nova_compute[187514]: 2025-11-29 01:01:38.158 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:01:38 np0005539279 nova_compute[187514]: 2025-11-29 01:01:38.159 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:01:38 np0005539279 nova_compute[187514]: 2025-11-29 01:01:38.159 187518 DEBUG nova.network.neutron [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 20:01:38 np0005539279 nova_compute[187514]: 2025-11-29 01:01:38.236 187518 DEBUG nova.compute.manager [req-c2e16c69-d73c-466a-a746-ebe645c68970 req-fde5fa80-76be-4517-9e8e-bda2ed21b3bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received event network-changed-8d103d7a-a705-4e47-a6ae-db4ad541cd7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:01:38 np0005539279 nova_compute[187514]: 2025-11-29 01:01:38.236 187518 DEBUG nova.compute.manager [req-c2e16c69-d73c-466a-a746-ebe645c68970 req-fde5fa80-76be-4517-9e8e-bda2ed21b3bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Refreshing instance network info cache due to event network-changed-8d103d7a-a705-4e47-a6ae-db4ad541cd7b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:01:38 np0005539279 nova_compute[187514]: 2025-11-29 01:01:38.237 187518 DEBUG oslo_concurrency.lockutils [req-c2e16c69-d73c-466a-a746-ebe645c68970 req-fde5fa80-76be-4517-9e8e-bda2ed21b3bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:01:38 np0005539279 nova_compute[187514]: 2025-11-29 01:01:38.374 187518 DEBUG nova.network.neutron [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.789 187518 DEBUG nova.network.neutron [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Updating instance_info_cache with network_info: [{"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.817 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.817 187518 DEBUG nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Instance network_info: |[{"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.818 187518 DEBUG oslo_concurrency.lockutils [req-c2e16c69-d73c-466a-a746-ebe645c68970 req-fde5fa80-76be-4517-9e8e-bda2ed21b3bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.818 187518 DEBUG nova.network.neutron [req-c2e16c69-d73c-466a-a746-ebe645c68970 req-fde5fa80-76be-4517-9e8e-bda2ed21b3bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Refreshing network info cache for port 8d103d7a-a705-4e47-a6ae-db4ad541cd7b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.823 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Start _get_guest_xml network_info=[{"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.830 187518 WARNING nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.837 187518 DEBUG nova.virt.libvirt.host [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.838 187518 DEBUG nova.virt.libvirt.host [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.847 187518 DEBUG nova.virt.libvirt.host [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.848 187518 DEBUG nova.virt.libvirt.host [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.848 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.849 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.849 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.850 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.850 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.851 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.851 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.851 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.852 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.852 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.853 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.853 187518 DEBUG nova.virt.hardware [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.859 187518 DEBUG nova.virt.libvirt.vif [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1955499757',display_name='tempest-TestNetworkBasicOps-server-1955499757',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1955499757',id=10,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD1zbsqltls+Cz0dORBskqPEoOvS6HjRdnmbg9Y+rsYPCGEU2ob/aGtwfojLWxtx+pvDYfc8xuQQZrSek3zOHIQ9WhSkj1vFvVPEVUNoNlAGz8KA+yOB6lTRlul1hTBNCA==',key_name='tempest-TestNetworkBasicOps-1707786599',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-nu9f6z9i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:01:32Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=7eefcff2-7ea5-473c-90fc-3d2795b90204,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.860 187518 DEBUG nova.network.os_vif_util [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.861 187518 DEBUG nova.network.os_vif_util [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:9c:e0,bridge_name='br-int',has_traffic_filtering=True,id=8d103d7a-a705-4e47-a6ae-db4ad541cd7b,network=Network(3a2ade36-9baf-4002-8072-378d5e061a3b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d103d7a-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.862 187518 DEBUG nova.objects.instance [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7eefcff2-7ea5-473c-90fc-3d2795b90204 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.878 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] End _get_guest_xml xml=<domain type="kvm">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <uuid>7eefcff2-7ea5-473c-90fc-3d2795b90204</uuid>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <name>instance-0000000a</name>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-1955499757</nova:name>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 01:01:39</nova:creationTime>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:        <nova:port uuid="8d103d7a-a705-4e47-a6ae-db4ad541cd7b">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <entry name="serial">7eefcff2-7ea5-473c-90fc-3d2795b90204</entry>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <entry name="uuid">7eefcff2-7ea5-473c-90fc-3d2795b90204</entry>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk.config"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:b6:9c:e0"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <target dev="tap8d103d7a-a7"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/console.log" append="off"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 20:01:39 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:01:39 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:01:39 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:01:39 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.880 187518 DEBUG nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Preparing to wait for external event network-vif-plugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.881 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.882 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.882 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.883 187518 DEBUG nova.virt.libvirt.vif [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1955499757',display_name='tempest-TestNetworkBasicOps-server-1955499757',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1955499757',id=10,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD1zbsqltls+Cz0dORBskqPEoOvS6HjRdnmbg9Y+rsYPCGEU2ob/aGtwfojLWxtx+pvDYfc8xuQQZrSek3zOHIQ9WhSkj1vFvVPEVUNoNlAGz8KA+yOB6lTRlul1hTBNCA==',key_name='tempest-TestNetworkBasicOps-1707786599',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-nu9f6z9i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:01:32Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=7eefcff2-7ea5-473c-90fc-3d2795b90204,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.884 187518 DEBUG nova.network.os_vif_util [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.885 187518 DEBUG nova.network.os_vif_util [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b6:9c:e0,bridge_name='br-int',has_traffic_filtering=True,id=8d103d7a-a705-4e47-a6ae-db4ad541cd7b,network=Network(3a2ade36-9baf-4002-8072-378d5e061a3b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d103d7a-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.886 187518 DEBUG os_vif [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:9c:e0,bridge_name='br-int',has_traffic_filtering=True,id=8d103d7a-a705-4e47-a6ae-db4ad541cd7b,network=Network(3a2ade36-9baf-4002-8072-378d5e061a3b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d103d7a-a7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.887 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.888 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.888 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.894 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.895 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d103d7a-a7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.896 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8d103d7a-a7, col_values=(('external_ids', {'iface-id': '8d103d7a-a705-4e47-a6ae-db4ad541cd7b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b6:9c:e0', 'vm-uuid': '7eefcff2-7ea5-473c-90fc-3d2795b90204'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:01:39 np0005539279 NetworkManager[55703]: <info>  [1764378099.8997] manager: (tap8d103d7a-a7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.900 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.902 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.910 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.912 187518 INFO os_vif [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b6:9c:e0,bridge_name='br-int',has_traffic_filtering=True,id=8d103d7a-a705-4e47-a6ae-db4ad541cd7b,network=Network(3a2ade36-9baf-4002-8072-378d5e061a3b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d103d7a-a7')#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.976 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.977 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.977 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:b6:9c:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 20:01:39 np0005539279 nova_compute[187514]: 2025-11-29 01:01:39.978 187518 INFO nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Using config drive#033[00m
Nov 28 20:01:40 np0005539279 nova_compute[187514]: 2025-11-29 01:01:40.541 187518 INFO nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Creating config drive at /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk.config#033[00m
Nov 28 20:01:40 np0005539279 nova_compute[187514]: 2025-11-29 01:01:40.548 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppd_bmd44 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:01:40 np0005539279 nova_compute[187514]: 2025-11-29 01:01:40.678 187518 DEBUG oslo_concurrency.processutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppd_bmd44" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:01:40 np0005539279 kernel: tap8d103d7a-a7: entered promiscuous mode
Nov 28 20:01:40 np0005539279 NetworkManager[55703]: <info>  [1764378100.7644] manager: (tap8d103d7a-a7): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Nov 28 20:01:40 np0005539279 ovn_controller[95686]: 2025-11-29T01:01:40Z|00136|binding|INFO|Claiming lport 8d103d7a-a705-4e47-a6ae-db4ad541cd7b for this chassis.
Nov 28 20:01:40 np0005539279 nova_compute[187514]: 2025-11-29 01:01:40.765 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:40 np0005539279 ovn_controller[95686]: 2025-11-29T01:01:40Z|00137|binding|INFO|8d103d7a-a705-4e47-a6ae-db4ad541cd7b: Claiming fa:16:3e:b6:9c:e0 10.100.0.4
Nov 28 20:01:40 np0005539279 nova_compute[187514]: 2025-11-29 01:01:40.774 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.786 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:9c:e0 10.100.0.4'], port_security=['fa:16:3e:b6:9c:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7eefcff2-7ea5-473c-90fc-3d2795b90204', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a2ade36-9baf-4002-8072-378d5e061a3b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': '17868449-0ee2-46de-87e8-2a4f7e17950c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12be912d-3802-4cf6-9580-e39ed656ba9b, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=8d103d7a-a705-4e47-a6ae-db4ad541cd7b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.787 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 8d103d7a-a705-4e47-a6ae-db4ad541cd7b in datapath 3a2ade36-9baf-4002-8072-378d5e061a3b bound to our chassis#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.789 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3a2ade36-9baf-4002-8072-378d5e061a3b#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.806 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[73bee1c7-fadb-4f8b-8089-367d990a4a2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.808 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3a2ade36-91 in ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.811 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3a2ade36-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.811 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[8882179e-a2a5-4ad3-b3d9-f043f5f90421]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:40 np0005539279 systemd-udevd[218677]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.813 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[288090de-d2ac-4c10-a278-7783a0643d50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.833 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[7b46398e-30bd-497e-bd66-ce27503d6ddc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:40 np0005539279 systemd-machined[153752]: New machine qemu-10-instance-0000000a.
Nov 28 20:01:40 np0005539279 NetworkManager[55703]: <info>  [1764378100.8400] device (tap8d103d7a-a7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 20:01:40 np0005539279 NetworkManager[55703]: <info>  [1764378100.8412] device (tap8d103d7a-a7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 20:01:40 np0005539279 nova_compute[187514]: 2025-11-29 01:01:40.859 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:40 np0005539279 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Nov 28 20:01:40 np0005539279 ovn_controller[95686]: 2025-11-29T01:01:40Z|00138|binding|INFO|Setting lport 8d103d7a-a705-4e47-a6ae-db4ad541cd7b ovn-installed in OVS
Nov 28 20:01:40 np0005539279 ovn_controller[95686]: 2025-11-29T01:01:40Z|00139|binding|INFO|Setting lport 8d103d7a-a705-4e47-a6ae-db4ad541cd7b up in Southbound
Nov 28 20:01:40 np0005539279 nova_compute[187514]: 2025-11-29 01:01:40.865 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.868 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[15667453-56a8-43aa-bab3-d052d214fb3c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.907 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[5d388f4f-bd00-4d79-85a5-3ce2d14c78f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.915 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c7b88d90-9e1f-4937-af9c-3c8a255172a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:40 np0005539279 NetworkManager[55703]: <info>  [1764378100.9168] manager: (tap3a2ade36-90): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.965 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[94113790-c544-4ddd-9867-8aee967f339c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:40 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:40.969 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[de668d21-a602-4ca0-bdf9-fe041c42d469]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:41 np0005539279 NetworkManager[55703]: <info>  [1764378101.0069] device (tap3a2ade36-90): carrier: link connected
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.014 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[3af01339-406e-41a2-bda1-b5fe478e3ff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.041 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[96abe20f-961c-456d-931c-db670499a164]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a2ade36-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:75:ad:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403471, 'reachable_time': 28270, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218712, 'error': None, 'target': 'ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.065 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[57a9b0d6-e706-4002-946a-03ac83d4c317]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe75:adcd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403471, 'tstamp': 403471}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218713, 'error': None, 'target': 'ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.095 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0d202712-d9da-4e7f-ba87-7f868053d830]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3a2ade36-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:75:ad:cd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403471, 'reachable_time': 28270, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218714, 'error': None, 'target': 'ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.153 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2eb1ed77-1117-4136-9d0e-01afc81ed0b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.238 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378101.238188, 7eefcff2-7ea5-473c-90fc-3d2795b90204 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.239 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] VM Started (Lifecycle Event)#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.265 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.271 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378101.2383099, 7eefcff2-7ea5-473c-90fc-3d2795b90204 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.271 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] VM Paused (Lifecycle Event)#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.273 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[8ebe5e53-10ca-43ae-a7ca-69ec8298c551]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.275 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a2ade36-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.275 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.276 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a2ade36-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:01:41 np0005539279 NetworkManager[55703]: <info>  [1764378101.2785] manager: (tap3a2ade36-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.277 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:41 np0005539279 kernel: tap3a2ade36-90: entered promiscuous mode
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.284 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3a2ade36-90, col_values=(('external_ids', {'iface-id': 'e6e2e803-b2d9-4cee-9305-279fbb83425a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.285 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:41 np0005539279 ovn_controller[95686]: 2025-11-29T01:01:41Z|00140|binding|INFO|Releasing lport e6e2e803-b2d9-4cee-9305-279fbb83425a from this chassis (sb_readonly=0)
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.295 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.300 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.310 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.311 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3a2ade36-9baf-4002-8072-378d5e061a3b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3a2ade36-9baf-4002-8072-378d5e061a3b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.313 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[5325742a-34c7-4147-b889-5191412b6cd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.314 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-3a2ade36-9baf-4002-8072-378d5e061a3b
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/3a2ade36-9baf-4002-8072-378d5e061a3b.pid.haproxy
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID 3a2ade36-9baf-4002-8072-378d5e061a3b
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 20:01:41 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:01:41.315 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b', 'env', 'PROCESS_TAG=haproxy-3a2ade36-9baf-4002-8072-378d5e061a3b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3a2ade36-9baf-4002-8072-378d5e061a3b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.322 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.500 187518 DEBUG nova.compute.manager [req-3765a3f1-696c-4a97-88cf-ba01e38c37db req-5df3069f-2f56-4ee2-90e2-172d1b8cc84c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received event network-vif-plugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.501 187518 DEBUG oslo_concurrency.lockutils [req-3765a3f1-696c-4a97-88cf-ba01e38c37db req-5df3069f-2f56-4ee2-90e2-172d1b8cc84c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.502 187518 DEBUG oslo_concurrency.lockutils [req-3765a3f1-696c-4a97-88cf-ba01e38c37db req-5df3069f-2f56-4ee2-90e2-172d1b8cc84c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.502 187518 DEBUG oslo_concurrency.lockutils [req-3765a3f1-696c-4a97-88cf-ba01e38c37db req-5df3069f-2f56-4ee2-90e2-172d1b8cc84c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.502 187518 DEBUG nova.compute.manager [req-3765a3f1-696c-4a97-88cf-ba01e38c37db req-5df3069f-2f56-4ee2-90e2-172d1b8cc84c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Processing event network-vif-plugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.503 187518 DEBUG nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.515 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378101.515023, 7eefcff2-7ea5-473c-90fc-3d2795b90204 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.515 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] VM Resumed (Lifecycle Event)#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.518 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.523 187518 INFO nova.virt.libvirt.driver [-] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Instance spawned successfully.#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.523 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.541 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.545 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.557 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.558 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.559 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.559 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.560 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.560 187518 DEBUG nova.virt.libvirt.driver [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.570 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.625 187518 INFO nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Took 9.21 seconds to spawn the instance on the hypervisor.
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.626 187518 DEBUG nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.695 187518 INFO nova.compute.manager [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Took 9.72 seconds to build instance.
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.713 187518 DEBUG oslo_concurrency.lockutils [None req-7ffce1cf-46c3-4832-bb96-1b573dba8bbe 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:01:41 np0005539279 podman[218753]: 2025-11-29 01:01:41.800871205 +0000 UTC m=+0.085604557 container create 6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.804 187518 DEBUG nova.network.neutron [req-c2e16c69-d73c-466a-a746-ebe645c68970 req-fde5fa80-76be-4517-9e8e-bda2ed21b3bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Updated VIF entry in instance network info cache for port 8d103d7a-a705-4e47-a6ae-db4ad541cd7b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.805 187518 DEBUG nova.network.neutron [req-c2e16c69-d73c-466a-a746-ebe645c68970 req-fde5fa80-76be-4517-9e8e-bda2ed21b3bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Updating instance_info_cache with network_info: [{"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 20:01:41 np0005539279 nova_compute[187514]: 2025-11-29 01:01:41.822 187518 DEBUG oslo_concurrency.lockutils [req-c2e16c69-d73c-466a-a746-ebe645c68970 req-fde5fa80-76be-4517-9e8e-bda2ed21b3bb 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 20:01:41 np0005539279 systemd[1]: Started libpod-conmon-6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f.scope.
Nov 28 20:01:41 np0005539279 podman[218753]: 2025-11-29 01:01:41.759432952 +0000 UTC m=+0.044166394 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 20:01:41 np0005539279 systemd[1]: Started libcrun container.
Nov 28 20:01:41 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a865a30b118897cf02bee71ad67a0561576ee9b33d0ffb755f00c5f891b85d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 20:01:41 np0005539279 podman[218753]: 2025-11-29 01:01:41.908822439 +0000 UTC m=+0.193555801 container init 6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 20:01:41 np0005539279 podman[218753]: 2025-11-29 01:01:41.919837589 +0000 UTC m=+0.204570951 container start 6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:01:41 np0005539279 neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b[218768]: [NOTICE]   (218773) : New worker (218775) forked
Nov 28 20:01:41 np0005539279 neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b[218768]: [NOTICE]   (218773) : Loading success.
Nov 28 20:01:42 np0005539279 nova_compute[187514]: 2025-11-29 01:01:42.600 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:43 np0005539279 nova_compute[187514]: 2025-11-29 01:01:43.625 187518 DEBUG nova.compute.manager [req-2e10c1fc-c0c8-4b7d-a301-bb483bf14387 req-ba84b266-35c1-4969-9a1a-55450a71c9da 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received event network-vif-plugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 20:01:43 np0005539279 nova_compute[187514]: 2025-11-29 01:01:43.626 187518 DEBUG oslo_concurrency.lockutils [req-2e10c1fc-c0c8-4b7d-a301-bb483bf14387 req-ba84b266-35c1-4969-9a1a-55450a71c9da 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 20:01:43 np0005539279 nova_compute[187514]: 2025-11-29 01:01:43.626 187518 DEBUG oslo_concurrency.lockutils [req-2e10c1fc-c0c8-4b7d-a301-bb483bf14387 req-ba84b266-35c1-4969-9a1a-55450a71c9da 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 20:01:43 np0005539279 nova_compute[187514]: 2025-11-29 01:01:43.627 187518 DEBUG oslo_concurrency.lockutils [req-2e10c1fc-c0c8-4b7d-a301-bb483bf14387 req-ba84b266-35c1-4969-9a1a-55450a71c9da 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:01:43 np0005539279 nova_compute[187514]: 2025-11-29 01:01:43.627 187518 DEBUG nova.compute.manager [req-2e10c1fc-c0c8-4b7d-a301-bb483bf14387 req-ba84b266-35c1-4969-9a1a-55450a71c9da 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] No waiting events found dispatching network-vif-plugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 20:01:43 np0005539279 nova_compute[187514]: 2025-11-29 01:01:43.628 187518 WARNING nova.compute.manager [req-2e10c1fc-c0c8-4b7d-a301-bb483bf14387 req-ba84b266-35c1-4969-9a1a-55450a71c9da 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received unexpected event network-vif-plugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b for instance with vm_state active and task_state None.
Nov 28 20:01:44 np0005539279 nova_compute[187514]: 2025-11-29 01:01:44.899 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:47 np0005539279 NetworkManager[55703]: <info>  [1764378107.4691] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Nov 28 20:01:47 np0005539279 NetworkManager[55703]: <info>  [1764378107.4709] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 28 20:01:47 np0005539279 ovn_controller[95686]: 2025-11-29T01:01:47Z|00141|binding|INFO|Releasing lport e6e2e803-b2d9-4cee-9305-279fbb83425a from this chassis (sb_readonly=0)
Nov 28 20:01:47 np0005539279 nova_compute[187514]: 2025-11-29 01:01:47.478 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:47 np0005539279 ovn_controller[95686]: 2025-11-29T01:01:47Z|00142|binding|INFO|Releasing lport e6e2e803-b2d9-4cee-9305-279fbb83425a from this chassis (sb_readonly=0)
Nov 28 20:01:47 np0005539279 nova_compute[187514]: 2025-11-29 01:01:47.524 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:47 np0005539279 nova_compute[187514]: 2025-11-29 01:01:47.534 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:47 np0005539279 nova_compute[187514]: 2025-11-29 01:01:47.601 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:47 np0005539279 nova_compute[187514]: 2025-11-29 01:01:47.766 187518 DEBUG nova.compute.manager [req-f7021599-57c5-4a21-ae27-594e6820394c req-c4455d30-edcf-471f-9e75-a732296fbe0b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received event network-changed-8d103d7a-a705-4e47-a6ae-db4ad541cd7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 20:01:47 np0005539279 nova_compute[187514]: 2025-11-29 01:01:47.767 187518 DEBUG nova.compute.manager [req-f7021599-57c5-4a21-ae27-594e6820394c req-c4455d30-edcf-471f-9e75-a732296fbe0b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Refreshing instance network info cache due to event network-changed-8d103d7a-a705-4e47-a6ae-db4ad541cd7b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 20:01:47 np0005539279 nova_compute[187514]: 2025-11-29 01:01:47.768 187518 DEBUG oslo_concurrency.lockutils [req-f7021599-57c5-4a21-ae27-594e6820394c req-c4455d30-edcf-471f-9e75-a732296fbe0b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 20:01:47 np0005539279 nova_compute[187514]: 2025-11-29 01:01:47.768 187518 DEBUG oslo_concurrency.lockutils [req-f7021599-57c5-4a21-ae27-594e6820394c req-c4455d30-edcf-471f-9e75-a732296fbe0b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 20:01:47 np0005539279 nova_compute[187514]: 2025-11-29 01:01:47.769 187518 DEBUG nova.network.neutron [req-f7021599-57c5-4a21-ae27-594e6820394c req-c4455d30-edcf-471f-9e75-a732296fbe0b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Refreshing network info cache for port 8d103d7a-a705-4e47-a6ae-db4ad541cd7b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 20:01:48 np0005539279 podman[218791]: 2025-11-29 01:01:48.870193403 +0000 UTC m=+0.100549741 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:01:48 np0005539279 podman[218789]: 2025-11-29 01:01:48.870804931 +0000 UTC m=+0.114756683 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:01:48 np0005539279 podman[218790]: 2025-11-29 01:01:48.891893353 +0000 UTC m=+0.124232898 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 20:01:49 np0005539279 nova_compute[187514]: 2025-11-29 01:01:49.689 187518 DEBUG nova.network.neutron [req-f7021599-57c5-4a21-ae27-594e6820394c req-c4455d30-edcf-471f-9e75-a732296fbe0b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Updated VIF entry in instance network info cache for port 8d103d7a-a705-4e47-a6ae-db4ad541cd7b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 20:01:49 np0005539279 nova_compute[187514]: 2025-11-29 01:01:49.691 187518 DEBUG nova.network.neutron [req-f7021599-57c5-4a21-ae27-594e6820394c req-c4455d30-edcf-471f-9e75-a732296fbe0b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Updating instance_info_cache with network_info: [{"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 20:01:49 np0005539279 nova_compute[187514]: 2025-11-29 01:01:49.718 187518 DEBUG oslo_concurrency.lockutils [req-f7021599-57c5-4a21-ae27-594e6820394c req-c4455d30-edcf-471f-9e75-a732296fbe0b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 20:01:49 np0005539279 nova_compute[187514]: 2025-11-29 01:01:49.948 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:52 np0005539279 nova_compute[187514]: 2025-11-29 01:01:52.603 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:53 np0005539279 ovn_controller[95686]: 2025-11-29T01:01:53Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b6:9c:e0 10.100.0.4
Nov 28 20:01:53 np0005539279 ovn_controller[95686]: 2025-11-29T01:01:53Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b6:9c:e0 10.100.0.4
Nov 28 20:01:54 np0005539279 nova_compute[187514]: 2025-11-29 01:01:54.951 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:57 np0005539279 nova_compute[187514]: 2025-11-29 01:01:57.605 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:01:59 np0005539279 nova_compute[187514]: 2025-11-29 01:01:59.955 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:02:00 np0005539279 nova_compute[187514]: 2025-11-29 01:02:00.231 187518 INFO nova.compute.manager [None req-680add0c-6f27-4ca8-aed7-43c42e1ef9a2 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Get console output
Nov 28 20:02:00 np0005539279 nova_compute[187514]: 2025-11-29 01:02:00.238 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 28 20:02:00 np0005539279 podman[218878]: 2025-11-29 01:02:00.858237674 +0000 UTC m=+0.102342452 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6)
Nov 28 20:02:00 np0005539279 podman[218879]: 2025-11-29 01:02:00.888233085 +0000 UTC m=+0.121955911 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 20:02:01 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:01Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b6:9c:e0 10.100.0.4
Nov 28 20:02:02 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:02Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b6:9c:e0 10.100.0.4
Nov 28 20:02:02 np0005539279 nova_compute[187514]: 2025-11-29 01:02:02.608 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:02:04 np0005539279 nova_compute[187514]: 2025-11-29 01:02:04.957 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:02:05 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:05.759 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 20:02:05 np0005539279 nova_compute[187514]: 2025-11-29 01:02:05.759 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:02:05 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:05.761 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 20:02:05 np0005539279 podman[218929]: 2025-11-29 01:02:05.847502468 +0000 UTC m=+0.082320422 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 20:02:05 np0005539279 podman[218928]: 2025-11-29 01:02:05.91097705 +0000 UTC m=+0.146479084 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.070 187518 DEBUG nova.compute.manager [req-6b780cad-41fb-4f47-b6c8-b7d072ca96fd req-6c3c6c5f-5ca0-44c2-82c6-b7195cb062ee 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received event network-changed-8d103d7a-a705-4e47-a6ae-db4ad541cd7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.070 187518 DEBUG nova.compute.manager [req-6b780cad-41fb-4f47-b6c8-b7d072ca96fd req-6c3c6c5f-5ca0-44c2-82c6-b7195cb062ee 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Refreshing instance network info cache due to event network-changed-8d103d7a-a705-4e47-a6ae-db4ad541cd7b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.071 187518 DEBUG oslo_concurrency.lockutils [req-6b780cad-41fb-4f47-b6c8-b7d072ca96fd req-6c3c6c5f-5ca0-44c2-82c6-b7195cb062ee 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.071 187518 DEBUG oslo_concurrency.lockutils [req-6b780cad-41fb-4f47-b6c8-b7d072ca96fd req-6c3c6c5f-5ca0-44c2-82c6-b7195cb062ee 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.072 187518 DEBUG nova.network.neutron [req-6b780cad-41fb-4f47-b6c8-b7d072ca96fd req-6c3c6c5f-5ca0-44c2-82c6-b7195cb062ee 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Refreshing network info cache for port 8d103d7a-a705-4e47-a6ae-db4ad541cd7b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.102 187518 DEBUG oslo_concurrency.lockutils [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "7eefcff2-7ea5-473c-90fc-3d2795b90204" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.102 187518 DEBUG oslo_concurrency.lockutils [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.103 187518 DEBUG oslo_concurrency.lockutils [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.104 187518 DEBUG oslo_concurrency.lockutils [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.104 187518 DEBUG oslo_concurrency.lockutils [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.106 187518 INFO nova.compute.manager [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Terminating instance#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.108 187518 DEBUG nova.compute.manager [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 20:02:06 np0005539279 kernel: tap8d103d7a-a7 (unregistering): left promiscuous mode
Nov 28 20:02:06 np0005539279 NetworkManager[55703]: <info>  [1764378126.1346] device (tap8d103d7a-a7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.144 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:06 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:06Z|00143|binding|INFO|Releasing lport 8d103d7a-a705-4e47-a6ae-db4ad541cd7b from this chassis (sb_readonly=0)
Nov 28 20:02:06 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:06Z|00144|binding|INFO|Setting lport 8d103d7a-a705-4e47-a6ae-db4ad541cd7b down in Southbound
Nov 28 20:02:06 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:06Z|00145|binding|INFO|Removing iface tap8d103d7a-a7 ovn-installed in OVS
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.147 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.151 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:9c:e0 10.100.0.4'], port_security=['fa:16:3e:b6:9c:e0 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '7eefcff2-7ea5-473c-90fc-3d2795b90204', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a2ade36-9baf-4002-8072-378d5e061a3b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': '17868449-0ee2-46de-87e8-2a4f7e17950c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12be912d-3802-4cf6-9580-e39ed656ba9b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=8d103d7a-a705-4e47-a6ae-db4ad541cd7b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.152 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 8d103d7a-a705-4e47-a6ae-db4ad541cd7b in datapath 3a2ade36-9baf-4002-8072-378d5e061a3b unbound from our chassis#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.153 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3a2ade36-9baf-4002-8072-378d5e061a3b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.154 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[49e66d66-bd47-403b-bf3d-ae7a9fe60fe8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.155 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b namespace which is not needed anymore#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.164 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:06 np0005539279 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 28 20:02:06 np0005539279 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 13.001s CPU time.
Nov 28 20:02:06 np0005539279 systemd-machined[153752]: Machine qemu-10-instance-0000000a terminated.
Nov 28 20:02:06 np0005539279 neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b[218768]: [NOTICE]   (218773) : haproxy version is 2.8.14-c23fe91
Nov 28 20:02:06 np0005539279 neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b[218768]: [NOTICE]   (218773) : path to executable is /usr/sbin/haproxy
Nov 28 20:02:06 np0005539279 neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b[218768]: [WARNING]  (218773) : Exiting Master process...
Nov 28 20:02:06 np0005539279 neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b[218768]: [ALERT]    (218773) : Current worker (218775) exited with code 143 (Terminated)
Nov 28 20:02:06 np0005539279 neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b[218768]: [WARNING]  (218773) : All workers exited. Exiting... (0)
Nov 28 20:02:06 np0005539279 systemd[1]: libpod-6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f.scope: Deactivated successfully.
Nov 28 20:02:06 np0005539279 podman[218995]: 2025-11-29 01:02:06.35249552 +0000 UTC m=+0.056134521 container died 6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 20:02:06 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f-userdata-shm.mount: Deactivated successfully.
Nov 28 20:02:06 np0005539279 systemd[1]: var-lib-containers-storage-overlay-3a865a30b118897cf02bee71ad67a0561576ee9b33d0ffb755f00c5f891b85d2-merged.mount: Deactivated successfully.
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.399 187518 INFO nova.virt.libvirt.driver [-] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Instance destroyed successfully.#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.400 187518 DEBUG nova.objects.instance [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid 7eefcff2-7ea5-473c-90fc-3d2795b90204 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:02:06 np0005539279 podman[218995]: 2025-11-29 01:02:06.403572073 +0000 UTC m=+0.107211064 container cleanup 6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.414 187518 DEBUG nova.virt.libvirt.vif [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T01:01:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1955499757',display_name='tempest-TestNetworkBasicOps-server-1955499757',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1955499757',id=10,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD1zbsqltls+Cz0dORBskqPEoOvS6HjRdnmbg9Y+rsYPCGEU2ob/aGtwfojLWxtx+pvDYfc8xuQQZrSek3zOHIQ9WhSkj1vFvVPEVUNoNlAGz8KA+yOB6lTRlul1hTBNCA==',key_name='tempest-TestNetworkBasicOps-1707786599',keypairs=<?>,launch_index=0,launched_at=2025-11-29T01:01:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-nu9f6z9i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T01:01:41Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=7eefcff2-7ea5-473c-90fc-3d2795b90204,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.414 187518 DEBUG nova.network.os_vif_util [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.415 187518 DEBUG nova.network.os_vif_util [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b6:9c:e0,bridge_name='br-int',has_traffic_filtering=True,id=8d103d7a-a705-4e47-a6ae-db4ad541cd7b,network=Network(3a2ade36-9baf-4002-8072-378d5e061a3b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d103d7a-a7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.415 187518 DEBUG os_vif [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:9c:e0,bridge_name='br-int',has_traffic_filtering=True,id=8d103d7a-a705-4e47-a6ae-db4ad541cd7b,network=Network(3a2ade36-9baf-4002-8072-378d5e061a3b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d103d7a-a7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.417 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.418 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d103d7a-a7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.420 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.421 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.427 187518 INFO os_vif [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b6:9c:e0,bridge_name='br-int',has_traffic_filtering=True,id=8d103d7a-a705-4e47-a6ae-db4ad541cd7b,network=Network(3a2ade36-9baf-4002-8072-378d5e061a3b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8d103d7a-a7')#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.428 187518 INFO nova.virt.libvirt.driver [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Deleting instance files /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204_del#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.428 187518 INFO nova.virt.libvirt.driver [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Deletion of /var/lib/nova/instances/7eefcff2-7ea5-473c-90fc-3d2795b90204_del complete#033[00m
Nov 28 20:02:06 np0005539279 systemd[1]: libpod-conmon-6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f.scope: Deactivated successfully.
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.484 187518 INFO nova.compute.manager [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.484 187518 DEBUG oslo.service.loopingcall [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.484 187518 DEBUG nova.compute.manager [-] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.484 187518 DEBUG nova.network.neutron [-] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 20:02:06 np0005539279 podman[219043]: 2025-11-29 01:02:06.491318841 +0000 UTC m=+0.056695688 container remove 6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.501 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3b55a2-c76f-4142-83e6-6e7ccb23a2e0]: (4, ('Sat Nov 29 01:02:06 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b (6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f)\n6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f\nSat Nov 29 01:02:06 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b (6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f)\n6de905757a53f83d79fb9f6779d5e1f310ebc6c722d6bc6e61091c2cf41e6b3f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.503 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[0a802614-89db-4d79-ae03-be17e5416f1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.504 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a2ade36-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:06 np0005539279 kernel: tap3a2ade36-90: left promiscuous mode
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.509 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:06 np0005539279 nova_compute[187514]: 2025-11-29 01:02:06.520 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.524 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e006f2bd-b805-4a37-8dad-b3274ccedd00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.543 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2443b07b-a5e0-40c0-94e7-a9a6ffb2daae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.545 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e797ac9b-951e-4c32-b586-593d822a8c96]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.568 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[00bc741b-9901-4bb6-8d20-3ec57648f0f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403460, 'reachable_time': 41227, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219058, 'error': None, 'target': 'ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.571 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3a2ade36-9baf-4002-8072-378d5e061a3b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 20:02:06 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:06.571 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa2862e-c153-453b-9d80-96b43017dea6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:06 np0005539279 systemd[1]: run-netns-ovnmeta\x2d3a2ade36\x2d9baf\x2d4002\x2d8072\x2d378d5e061a3b.mount: Deactivated successfully.
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.071 187518 DEBUG nova.network.neutron [-] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.089 187518 INFO nova.compute.manager [-] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Took 0.60 seconds to deallocate network for instance.#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.137 187518 DEBUG oslo_concurrency.lockutils [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.138 187518 DEBUG oslo_concurrency.lockutils [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.206 187518 DEBUG nova.compute.provider_tree [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.225 187518 DEBUG nova.scheduler.client.report [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.244 187518 DEBUG oslo_concurrency.lockutils [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.271 187518 INFO nova.scheduler.client.report [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance 7eefcff2-7ea5-473c-90fc-3d2795b90204#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.299 187518 DEBUG nova.network.neutron [req-6b780cad-41fb-4f47-b6c8-b7d072ca96fd req-6c3c6c5f-5ca0-44c2-82c6-b7195cb062ee 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Updated VIF entry in instance network info cache for port 8d103d7a-a705-4e47-a6ae-db4ad541cd7b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.299 187518 DEBUG nova.network.neutron [req-6b780cad-41fb-4f47-b6c8-b7d072ca96fd req-6c3c6c5f-5ca0-44c2-82c6-b7195cb062ee 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Updating instance_info_cache with network_info: [{"id": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "address": "fa:16:3e:b6:9c:e0", "network": {"id": "3a2ade36-9baf-4002-8072-378d5e061a3b", "bridge": "br-int", "label": "tempest-network-smoke--1823836960", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8d103d7a-a7", "ovs_interfaceid": "8d103d7a-a705-4e47-a6ae-db4ad541cd7b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.324 187518 DEBUG oslo_concurrency.lockutils [req-6b780cad-41fb-4f47-b6c8-b7d072ca96fd req-6c3c6c5f-5ca0-44c2-82c6-b7195cb062ee 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-7eefcff2-7ea5-473c-90fc-3d2795b90204" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.366 187518 DEBUG oslo_concurrency.lockutils [None req-7125a2e4-04f8-4e81-b684-cf2ffaf06588 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:07 np0005539279 nova_compute[187514]: 2025-11-29 01:02:07.610 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:08.096 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:08.097 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:08.097 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.186 187518 DEBUG nova.compute.manager [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received event network-vif-unplugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.186 187518 DEBUG oslo_concurrency.lockutils [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.187 187518 DEBUG oslo_concurrency.lockutils [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.187 187518 DEBUG oslo_concurrency.lockutils [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.188 187518 DEBUG nova.compute.manager [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] No waiting events found dispatching network-vif-unplugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.188 187518 WARNING nova.compute.manager [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received unexpected event network-vif-unplugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b for instance with vm_state deleted and task_state None.#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.188 187518 DEBUG nova.compute.manager [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received event network-vif-plugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.189 187518 DEBUG oslo_concurrency.lockutils [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.189 187518 DEBUG oslo_concurrency.lockutils [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.190 187518 DEBUG oslo_concurrency.lockutils [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "7eefcff2-7ea5-473c-90fc-3d2795b90204-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.190 187518 DEBUG nova.compute.manager [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] No waiting events found dispatching network-vif-plugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.191 187518 WARNING nova.compute.manager [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received unexpected event network-vif-plugged-8d103d7a-a705-4e47-a6ae-db4ad541cd7b for instance with vm_state deleted and task_state None.#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.191 187518 DEBUG nova.compute.manager [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Received event network-vif-deleted-8d103d7a-a705-4e47-a6ae-db4ad541cd7b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.191 187518 INFO nova.compute.manager [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Neutron deleted interface 8d103d7a-a705-4e47-a6ae-db4ad541cd7b; detaching it from the instance and deleting it from the info cache#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.192 187518 DEBUG nova.network.neutron [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Nov 28 20:02:08 np0005539279 nova_compute[187514]: 2025-11-29 01:02:08.194 187518 DEBUG nova.compute.manager [req-64ff8c22-f5e8-48db-96a6-a3ab9acbf78e req-6b17ffb7-f76a-403e-9c3a-ac3c4e0ae3b4 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Detach interface failed, port_id=8d103d7a-a705-4e47-a6ae-db4ad541cd7b, reason: Instance 7eefcff2-7ea5-473c-90fc-3d2795b90204 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 28 20:02:11 np0005539279 nova_compute[187514]: 2025-11-29 01:02:11.423 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:12 np0005539279 nova_compute[187514]: 2025-11-29 01:02:12.030 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:12 np0005539279 nova_compute[187514]: 2025-11-29 01:02:12.131 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:12 np0005539279 nova_compute[187514]: 2025-11-29 01:02:12.613 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:12 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:12.763 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:16 np0005539279 nova_compute[187514]: 2025-11-29 01:02:16.426 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:17 np0005539279 nova_compute[187514]: 2025-11-29 01:02:17.615 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:19 np0005539279 podman[219066]: 2025-11-29 01:02:19.854769037 +0000 UTC m=+0.080014144 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 20:02:19 np0005539279 podman[219064]: 2025-11-29 01:02:19.871570075 +0000 UTC m=+0.103520777 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 20:02:19 np0005539279 podman[219065]: 2025-11-29 01:02:19.880151314 +0000 UTC m=+0.110556051 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:02:21 np0005539279 nova_compute[187514]: 2025-11-29 01:02:21.398 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764378126.396276, 7eefcff2-7ea5-473c-90fc-3d2795b90204 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:02:21 np0005539279 nova_compute[187514]: 2025-11-29 01:02:21.399 187518 INFO nova.compute.manager [-] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] VM Stopped (Lifecycle Event)#033[00m
Nov 28 20:02:21 np0005539279 nova_compute[187514]: 2025-11-29 01:02:21.420 187518 DEBUG nova.compute.manager [None req-4d4430b9-61a4-4c5f-849c-98dd8bf0bbcc - - - - - -] [instance: 7eefcff2-7ea5-473c-90fc-3d2795b90204] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:02:21 np0005539279 nova_compute[187514]: 2025-11-29 01:02:21.429 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:22 np0005539279 nova_compute[187514]: 2025-11-29 01:02:22.618 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:23 np0005539279 nova_compute[187514]: 2025-11-29 01:02:23.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:24 np0005539279 nova_compute[187514]: 2025-11-29 01:02:24.897 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:24 np0005539279 nova_compute[187514]: 2025-11-29 01:02:24.898 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:24 np0005539279 nova_compute[187514]: 2025-11-29 01:02:24.923 187518 DEBUG nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.015 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.016 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.026 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.026 187518 INFO nova.compute.claims [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.157 187518 DEBUG nova.compute.provider_tree [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.179 187518 DEBUG nova.scheduler.client.report [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.199 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.200 187518 DEBUG nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.253 187518 DEBUG nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.254 187518 DEBUG nova.network.neutron [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.273 187518 INFO nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.298 187518 DEBUG nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.388 187518 DEBUG nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.390 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.391 187518 INFO nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Creating image(s)#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.392 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.392 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.393 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.417 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.505 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.506 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.507 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.521 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.615 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.616 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.665 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.666 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.667 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.667 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.672 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.673 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.674 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.750 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.752 187518 DEBUG nova.virt.disk.api [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.752 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.823 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.825 187518 DEBUG nova.virt.disk.api [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.825 187518 DEBUG nova.objects.instance [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid 4934aff6-60d5-416e-97b3-bed2dbe82055 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.925 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.927 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5763MB free_disk=73.33824157714844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.927 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:25 np0005539279 nova_compute[187514]: 2025-11-29 01:02:25.928 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.030 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.031 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Ensure instance console log exists: /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.031 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.032 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.032 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.082 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Instance 4934aff6-60d5-416e-97b3-bed2dbe82055 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.083 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.083 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.120 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.132 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.152 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.152 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.432 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:26 np0005539279 nova_compute[187514]: 2025-11-29 01:02:26.498 187518 DEBUG nova.policy [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 20:02:27 np0005539279 nova_compute[187514]: 2025-11-29 01:02:27.621 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.154 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.154 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.155 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.179 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.179 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.180 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.627 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.627 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:02:28 np0005539279 nova_compute[187514]: 2025-11-29 01:02:28.802 187518 DEBUG nova.network.neutron [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Successfully created port: 4b2bcfa2-03bd-4475-b868-15c0531e30d4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 20:02:29 np0005539279 nova_compute[187514]: 2025-11-29 01:02:29.472 187518 DEBUG nova.network.neutron [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Successfully updated port: 4b2bcfa2-03bd-4475-b868-15c0531e30d4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 20:02:29 np0005539279 nova_compute[187514]: 2025-11-29 01:02:29.490 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:02:29 np0005539279 nova_compute[187514]: 2025-11-29 01:02:29.490 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:02:29 np0005539279 nova_compute[187514]: 2025-11-29 01:02:29.491 187518 DEBUG nova.network.neutron [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 20:02:29 np0005539279 nova_compute[187514]: 2025-11-29 01:02:29.586 187518 DEBUG nova.compute.manager [req-6f55512b-ad7a-43d0-a2c4-3c0105af7b49 req-8588c3fa-a5a8-4a90-be1a-a074795b443a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:29 np0005539279 nova_compute[187514]: 2025-11-29 01:02:29.586 187518 DEBUG nova.compute.manager [req-6f55512b-ad7a-43d0-a2c4-3c0105af7b49 req-8588c3fa-a5a8-4a90-be1a-a074795b443a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing instance network info cache due to event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:02:29 np0005539279 nova_compute[187514]: 2025-11-29 01:02:29.587 187518 DEBUG oslo_concurrency.lockutils [req-6f55512b-ad7a-43d0-a2c4-3c0105af7b49 req-8588c3fa-a5a8-4a90-be1a-a074795b443a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:02:29 np0005539279 nova_compute[187514]: 2025-11-29 01:02:29.673 187518 DEBUG nova.network.neutron [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 20:02:31 np0005539279 nova_compute[187514]: 2025-11-29 01:02:31.436 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:31 np0005539279 nova_compute[187514]: 2025-11-29 01:02:31.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:31 np0005539279 podman[219139]: 2025-11-29 01:02:31.832103979 +0000 UTC m=+0.076934394 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 20:02:31 np0005539279 podman[219140]: 2025-11-29 01:02:31.852760229 +0000 UTC m=+0.087614865 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.185 187518 DEBUG nova.network.neutron [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updating instance_info_cache with network_info: [{"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.216 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.216 187518 DEBUG nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Instance network_info: |[{"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.217 187518 DEBUG oslo_concurrency.lockutils [req-6f55512b-ad7a-43d0-a2c4-3c0105af7b49 req-8588c3fa-a5a8-4a90-be1a-a074795b443a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.218 187518 DEBUG nova.network.neutron [req-6f55512b-ad7a-43d0-a2c4-3c0105af7b49 req-8588c3fa-a5a8-4a90-be1a-a074795b443a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.223 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Start _get_guest_xml network_info=[{"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.230 187518 WARNING nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.241 187518 DEBUG nova.virt.libvirt.host [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.242 187518 DEBUG nova.virt.libvirt.host [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.249 187518 DEBUG nova.virt.libvirt.host [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.250 187518 DEBUG nova.virt.libvirt.host [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.251 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.251 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.252 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.253 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.253 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.253 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.254 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.254 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.255 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.255 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.255 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.256 187518 DEBUG nova.virt.hardware [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.262 187518 DEBUG nova.virt.libvirt.vif [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:02:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1184505323',display_name='tempest-TestNetworkBasicOps-server-1184505323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1184505323',id=11,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKg+pcfICyoy0PdUh7IAeh34OcMnvlgfDaE12t4Y/9/QjK5SA5Wnh43QNi8YWI2ahCwmMnrIQ65ramNfTIFEmyqXTFXzbnkjufkp+4+PrWJhxjICuVaXjNX4nlafiL43BQ==',key_name='tempest-TestNetworkBasicOps-1133988200',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-bkbw0tu9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:02:25Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=4934aff6-60d5-416e-97b3-bed2dbe82055,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.262 187518 DEBUG nova.network.os_vif_util [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.264 187518 DEBUG nova.network.os_vif_util [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:25:7c,bridge_name='br-int',has_traffic_filtering=True,id=4b2bcfa2-03bd-4475-b868-15c0531e30d4,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2bcfa2-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.265 187518 DEBUG nova.objects.instance [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4934aff6-60d5-416e-97b3-bed2dbe82055 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.280 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] End _get_guest_xml xml=<domain type="kvm">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <uuid>4934aff6-60d5-416e-97b3-bed2dbe82055</uuid>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <name>instance-0000000b</name>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-1184505323</nova:name>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 01:02:32</nova:creationTime>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:        <nova:port uuid="4b2bcfa2-03bd-4475-b868-15c0531e30d4">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <entry name="serial">4934aff6-60d5-416e-97b3-bed2dbe82055</entry>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <entry name="uuid">4934aff6-60d5-416e-97b3-bed2dbe82055</entry>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk.config"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:8c:25:7c"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <target dev="tap4b2bcfa2-03"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/console.log" append="off"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 20:02:32 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:02:32 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:02:32 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:02:32 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.281 187518 DEBUG nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Preparing to wait for external event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.282 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.282 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.283 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.283 187518 DEBUG nova.virt.libvirt.vif [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:02:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1184505323',display_name='tempest-TestNetworkBasicOps-server-1184505323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1184505323',id=11,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKg+pcfICyoy0PdUh7IAeh34OcMnvlgfDaE12t4Y/9/QjK5SA5Wnh43QNi8YWI2ahCwmMnrIQ65ramNfTIFEmyqXTFXzbnkjufkp+4+PrWJhxjICuVaXjNX4nlafiL43BQ==',key_name='tempest-TestNetworkBasicOps-1133988200',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-bkbw0tu9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:02:25Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=4934aff6-60d5-416e-97b3-bed2dbe82055,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.284 187518 DEBUG nova.network.os_vif_util [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.285 187518 DEBUG nova.network.os_vif_util [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:25:7c,bridge_name='br-int',has_traffic_filtering=True,id=4b2bcfa2-03bd-4475-b868-15c0531e30d4,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2bcfa2-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.285 187518 DEBUG os_vif [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:25:7c,bridge_name='br-int',has_traffic_filtering=True,id=4b2bcfa2-03bd-4475-b868-15c0531e30d4,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2bcfa2-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.286 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.286 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.287 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.290 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.291 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b2bcfa2-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.291 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b2bcfa2-03, col_values=(('external_ids', {'iface-id': '4b2bcfa2-03bd-4475-b868-15c0531e30d4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:25:7c', 'vm-uuid': '4934aff6-60d5-416e-97b3-bed2dbe82055'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.293 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:32 np0005539279 NetworkManager[55703]: <info>  [1764378152.2959] manager: (tap4b2bcfa2-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.297 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.303 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.304 187518 INFO os_vif [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:25:7c,bridge_name='br-int',has_traffic_filtering=True,id=4b2bcfa2-03bd-4475-b868-15c0531e30d4,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2bcfa2-03')#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.364 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.365 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.365 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:8c:25:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.366 187518 INFO nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Using config drive#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.604 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.623 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.727 187518 INFO nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Creating config drive at /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk.config#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.735 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp15p3gues execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.872 187518 DEBUG oslo_concurrency.processutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp15p3gues" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:32 np0005539279 NetworkManager[55703]: <info>  [1764378152.9380] manager: (tap4b2bcfa2-03): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Nov 28 20:02:32 np0005539279 kernel: tap4b2bcfa2-03: entered promiscuous mode
Nov 28 20:02:32 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:32Z|00146|binding|INFO|Claiming lport 4b2bcfa2-03bd-4475-b868-15c0531e30d4 for this chassis.
Nov 28 20:02:32 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:32Z|00147|binding|INFO|4b2bcfa2-03bd-4475-b868-15c0531e30d4: Claiming fa:16:3e:8c:25:7c 10.100.0.14
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.940 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:32 np0005539279 nova_compute[187514]: 2025-11-29 01:02:32.953 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:32.960 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:25:7c 10.100.0.14'], port_security=['fa:16:3e:8c:25:7c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '4934aff6-60d5-416e-97b3-bed2dbe82055', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': '620dc653-8302-473f-918a-b571669e756e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=135d424c-826f-4d0f-bd1e-5a6354f7c71b, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=4b2bcfa2-03bd-4475-b868-15c0531e30d4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:02:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:32.961 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 4b2bcfa2-03bd-4475-b868-15c0531e30d4 in datapath f68e6d0c-8d35-4ac7-800e-d8d5def4a774 bound to our chassis#033[00m
Nov 28 20:02:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:32.962 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f68e6d0c-8d35-4ac7-800e-d8d5def4a774#033[00m
Nov 28 20:02:32 np0005539279 systemd-udevd[219201]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 20:02:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:32.974 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[07d6a5e8-8d2d-4aea-8d84-36b3e552414f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:32.975 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf68e6d0c-81 in ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 20:02:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:32.976 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf68e6d0c-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 20:02:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:32.976 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[ef11d812-5466-4e9c-a60c-e21eddd5ebfd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:32.977 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[47a05ce2-3f41-491f-babb-243498740ef1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:32 np0005539279 NetworkManager[55703]: <info>  [1764378152.9861] device (tap4b2bcfa2-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 20:02:32 np0005539279 NetworkManager[55703]: <info>  [1764378152.9870] device (tap4b2bcfa2-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 20:02:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:32.987 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[1f41e61c-ce9c-4ab2-afb9-6a145c1b5194]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:32 np0005539279 systemd-machined[153752]: New machine qemu-11-instance-0000000b.
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.020 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e345f983-3262-4a67-ab9d-e90bf26cccbf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Nov 28 20:02:33 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:33Z|00148|binding|INFO|Setting lport 4b2bcfa2-03bd-4475-b868-15c0531e30d4 ovn-installed in OVS
Nov 28 20:02:33 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:33Z|00149|binding|INFO|Setting lport 4b2bcfa2-03bd-4475-b868-15c0531e30d4 up in Southbound
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.033 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.062 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[f91bdce0-329f-4bf3-90f4-954faffd5e00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 NetworkManager[55703]: <info>  [1764378153.0692] manager: (tapf68e6d0c-80): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.068 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ca5768-e249-4a9a-8ff6-50a81f51c7ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.109 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[6b5a9c4c-ae02-4041-b7ca-23c36b92e110]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.113 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0f8086-1268-4b6a-a37c-9adebca41da9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 NetworkManager[55703]: <info>  [1764378153.1480] device (tapf68e6d0c-80): carrier: link connected
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.157 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee917cd-76fa-4107-8532-8850e92998c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.180 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e6c5af57-0792-4a60-a7ee-8ff64abbd713]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf68e6d0c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:b9:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408685, 'reachable_time': 23801, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219236, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.204 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b93852f7-61a1-4068-a4cb-80d38c386f3d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe19:b9f2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408685, 'tstamp': 408685}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219237, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.230 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[1638af0f-7d21-4600-ba83-296c25ed0307]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf68e6d0c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:b9:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408685, 'reachable_time': 23801, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219238, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.273 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[d2eebcb9-bcd9-4c3d-a884-580ec72734c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.354 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[b929ca22-62be-4ac2-8bc3-16fb97fa37db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.356 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf68e6d0c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.356 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.357 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf68e6d0c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.359 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:33 np0005539279 NetworkManager[55703]: <info>  [1764378153.3605] manager: (tapf68e6d0c-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 28 20:02:33 np0005539279 kernel: tapf68e6d0c-80: entered promiscuous mode
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.362 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.365 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf68e6d0c-80, col_values=(('external_ids', {'iface-id': '3b1483fb-3e2e-4ef3-97c1-8cc0ddc07ca1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:33 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:33Z|00150|binding|INFO|Releasing lport 3b1483fb-3e2e-4ef3-97c1-8cc0ddc07ca1 from this chassis (sb_readonly=0)
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.366 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.388 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.391 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f68e6d0c-8d35-4ac7-800e-d8d5def4a774.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f68e6d0c-8d35-4ac7-800e-d8d5def4a774.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.392 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f9c427-2bae-4dbe-91d1-091a3d69173c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.393 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-f68e6d0c-8d35-4ac7-800e-d8d5def4a774
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/f68e6d0c-8d35-4ac7-800e-d8d5def4a774.pid.haproxy
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID f68e6d0c-8d35-4ac7-800e-d8d5def4a774
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 20:02:33 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:33.394 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'env', 'PROCESS_TAG=haproxy-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f68e6d0c-8d35-4ac7-800e-d8d5def4a774.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.502 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378153.5021994, 4934aff6-60d5-416e-97b3-bed2dbe82055 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.503 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] VM Started (Lifecycle Event)#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.535 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.539 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378153.502303, 4934aff6-60d5-416e-97b3-bed2dbe82055 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.540 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] VM Paused (Lifecycle Event)#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.563 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.567 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.598 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.611 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.642 187518 DEBUG nova.compute.manager [req-8c0d9c9d-4433-4631-8888-c82a10438881 req-41b69fd3-7df1-40f3-bce0-f2f97ab379ce 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.643 187518 DEBUG oslo_concurrency.lockutils [req-8c0d9c9d-4433-4631-8888-c82a10438881 req-41b69fd3-7df1-40f3-bce0-f2f97ab379ce 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.643 187518 DEBUG oslo_concurrency.lockutils [req-8c0d9c9d-4433-4631-8888-c82a10438881 req-41b69fd3-7df1-40f3-bce0-f2f97ab379ce 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.644 187518 DEBUG oslo_concurrency.lockutils [req-8c0d9c9d-4433-4631-8888-c82a10438881 req-41b69fd3-7df1-40f3-bce0-f2f97ab379ce 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.645 187518 DEBUG nova.compute.manager [req-8c0d9c9d-4433-4631-8888-c82a10438881 req-41b69fd3-7df1-40f3-bce0-f2f97ab379ce 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Processing event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.646 187518 DEBUG nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.654 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378153.6539505, 4934aff6-60d5-416e-97b3-bed2dbe82055 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.654 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] VM Resumed (Lifecycle Event)#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.655 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.658 187518 INFO nova.virt.libvirt.driver [-] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Instance spawned successfully.#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.658 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.695 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.701 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.705 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.705 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.706 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.706 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.706 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.707 187518 DEBUG nova.virt.libvirt.driver [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.741 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.773 187518 INFO nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Took 8.38 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.774 187518 DEBUG nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.837 187518 INFO nova.compute.manager [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Took 8.85 seconds to build instance.#033[00m
Nov 28 20:02:33 np0005539279 nova_compute[187514]: 2025-11-29 01:02:33.852 187518 DEBUG oslo_concurrency.lockutils [None req-85a81d45-8876-4231-8e6f-a744d69ccbff 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.954s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:33 np0005539279 podman[219277]: 2025-11-29 01:02:33.85625051 +0000 UTC m=+0.072235679 container create 83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 20:02:33 np0005539279 systemd[1]: Started libpod-conmon-83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe.scope.
Nov 28 20:02:33 np0005539279 podman[219277]: 2025-11-29 01:02:33.809311277 +0000 UTC m=+0.025296446 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 20:02:33 np0005539279 systemd[1]: Started libcrun container.
Nov 28 20:02:33 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d52db0c140c40629cb7522c7c735f9c837b2409735c59c793b6741a0bc7ef39/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 20:02:33 np0005539279 podman[219277]: 2025-11-29 01:02:33.951424313 +0000 UTC m=+0.167409512 container init 83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:02:33 np0005539279 podman[219277]: 2025-11-29 01:02:33.956298865 +0000 UTC m=+0.172284024 container start 83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:02:33 np0005539279 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 20:02:33 np0005539279 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 20:02:33 np0005539279 neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774[219292]: [NOTICE]   (219296) : New worker (219299) forked
Nov 28 20:02:33 np0005539279 neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774[219292]: [NOTICE]   (219296) : Loading success.
Nov 28 20:02:34 np0005539279 nova_compute[187514]: 2025-11-29 01:02:34.790 187518 DEBUG nova.network.neutron [req-6f55512b-ad7a-43d0-a2c4-3c0105af7b49 req-8588c3fa-a5a8-4a90-be1a-a074795b443a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updated VIF entry in instance network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:02:34 np0005539279 nova_compute[187514]: 2025-11-29 01:02:34.792 187518 DEBUG nova.network.neutron [req-6f55512b-ad7a-43d0-a2c4-3c0105af7b49 req-8588c3fa-a5a8-4a90-be1a-a074795b443a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updating instance_info_cache with network_info: [{"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:02:34 np0005539279 nova_compute[187514]: 2025-11-29 01:02:34.811 187518 DEBUG oslo_concurrency.lockutils [req-6f55512b-ad7a-43d0-a2c4-3c0105af7b49 req-8588c3fa-a5a8-4a90-be1a-a074795b443a 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:02:35 np0005539279 nova_compute[187514]: 2025-11-29 01:02:35.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:02:35 np0005539279 nova_compute[187514]: 2025-11-29 01:02:35.744 187518 DEBUG nova.compute.manager [req-0ce662ec-6879-4ab2-bfdb-92fa23fe0a29 req-60906142-04fc-4a7d-b485-bbb26cfd344d 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:35 np0005539279 nova_compute[187514]: 2025-11-29 01:02:35.745 187518 DEBUG oslo_concurrency.lockutils [req-0ce662ec-6879-4ab2-bfdb-92fa23fe0a29 req-60906142-04fc-4a7d-b485-bbb26cfd344d 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:35 np0005539279 nova_compute[187514]: 2025-11-29 01:02:35.746 187518 DEBUG oslo_concurrency.lockutils [req-0ce662ec-6879-4ab2-bfdb-92fa23fe0a29 req-60906142-04fc-4a7d-b485-bbb26cfd344d 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:35 np0005539279 nova_compute[187514]: 2025-11-29 01:02:35.747 187518 DEBUG oslo_concurrency.lockutils [req-0ce662ec-6879-4ab2-bfdb-92fa23fe0a29 req-60906142-04fc-4a7d-b485-bbb26cfd344d 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:35 np0005539279 nova_compute[187514]: 2025-11-29 01:02:35.747 187518 DEBUG nova.compute.manager [req-0ce662ec-6879-4ab2-bfdb-92fa23fe0a29 req-60906142-04fc-4a7d-b485-bbb26cfd344d 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] No waiting events found dispatching network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:02:35 np0005539279 nova_compute[187514]: 2025-11-29 01:02:35.748 187518 WARNING nova.compute.manager [req-0ce662ec-6879-4ab2-bfdb-92fa23fe0a29 req-60906142-04fc-4a7d-b485-bbb26cfd344d 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received unexpected event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 for instance with vm_state active and task_state None.#033[00m
Nov 28 20:02:36 np0005539279 podman[219309]: 2025-11-29 01:02:36.848985334 +0000 UTC m=+0.085790352 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 20:02:36 np0005539279 podman[219308]: 2025-11-29 01:02:36.892701783 +0000 UTC m=+0.130726097 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 20:02:37 np0005539279 nova_compute[187514]: 2025-11-29 01:02:37.294 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:37 np0005539279 nova_compute[187514]: 2025-11-29 01:02:37.626 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:38 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:38Z|00151|binding|INFO|Releasing lport 3b1483fb-3e2e-4ef3-97c1-8cc0ddc07ca1 from this chassis (sb_readonly=0)
Nov 28 20:02:38 np0005539279 nova_compute[187514]: 2025-11-29 01:02:38.923 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:38 np0005539279 NetworkManager[55703]: <info>  [1764378158.9264] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Nov 28 20:02:38 np0005539279 NetworkManager[55703]: <info>  [1764378158.9283] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Nov 28 20:02:38 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:38Z|00152|binding|INFO|Releasing lport 3b1483fb-3e2e-4ef3-97c1-8cc0ddc07ca1 from this chassis (sb_readonly=0)
Nov 28 20:02:38 np0005539279 nova_compute[187514]: 2025-11-29 01:02:38.980 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:38 np0005539279 nova_compute[187514]: 2025-11-29 01:02:38.989 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:39 np0005539279 nova_compute[187514]: 2025-11-29 01:02:39.191 187518 DEBUG nova.compute.manager [req-408323e2-4ed1-4282-bde9-cc2e52c8cdd4 req-ed9f8fc7-cb51-4fdc-9c65-a8b41815c615 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:39 np0005539279 nova_compute[187514]: 2025-11-29 01:02:39.192 187518 DEBUG nova.compute.manager [req-408323e2-4ed1-4282-bde9-cc2e52c8cdd4 req-ed9f8fc7-cb51-4fdc-9c65-a8b41815c615 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing instance network info cache due to event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:02:39 np0005539279 nova_compute[187514]: 2025-11-29 01:02:39.193 187518 DEBUG oslo_concurrency.lockutils [req-408323e2-4ed1-4282-bde9-cc2e52c8cdd4 req-ed9f8fc7-cb51-4fdc-9c65-a8b41815c615 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:02:39 np0005539279 nova_compute[187514]: 2025-11-29 01:02:39.193 187518 DEBUG oslo_concurrency.lockutils [req-408323e2-4ed1-4282-bde9-cc2e52c8cdd4 req-ed9f8fc7-cb51-4fdc-9c65-a8b41815c615 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:02:39 np0005539279 nova_compute[187514]: 2025-11-29 01:02:39.194 187518 DEBUG nova.network.neutron [req-408323e2-4ed1-4282-bde9-cc2e52c8cdd4 req-ed9f8fc7-cb51-4fdc-9c65-a8b41815c615 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:02:40 np0005539279 nova_compute[187514]: 2025-11-29 01:02:40.616 187518 DEBUG nova.network.neutron [req-408323e2-4ed1-4282-bde9-cc2e52c8cdd4 req-ed9f8fc7-cb51-4fdc-9c65-a8b41815c615 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updated VIF entry in instance network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:02:40 np0005539279 nova_compute[187514]: 2025-11-29 01:02:40.617 187518 DEBUG nova.network.neutron [req-408323e2-4ed1-4282-bde9-cc2e52c8cdd4 req-ed9f8fc7-cb51-4fdc-9c65-a8b41815c615 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updating instance_info_cache with network_info: [{"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:02:40 np0005539279 nova_compute[187514]: 2025-11-29 01:02:40.644 187518 DEBUG oslo_concurrency.lockutils [req-408323e2-4ed1-4282-bde9-cc2e52c8cdd4 req-ed9f8fc7-cb51-4fdc-9c65-a8b41815c615 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:02:42 np0005539279 nova_compute[187514]: 2025-11-29 01:02:42.331 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:42 np0005539279 nova_compute[187514]: 2025-11-29 01:02:42.628 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:44 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:44Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8c:25:7c 10.100.0.14
Nov 28 20:02:44 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:44Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8c:25:7c 10.100.0.14
Nov 28 20:02:44 np0005539279 nova_compute[187514]: 2025-11-29 01:02:44.776 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "1f1e4b57-b962-46c9-a111-b97078141733" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:44 np0005539279 nova_compute[187514]: 2025-11-29 01:02:44.778 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:44 np0005539279 nova_compute[187514]: 2025-11-29 01:02:44.797 187518 DEBUG nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 20:02:44 np0005539279 nova_compute[187514]: 2025-11-29 01:02:44.910 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:44 np0005539279 nova_compute[187514]: 2025-11-29 01:02:44.911 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:44 np0005539279 nova_compute[187514]: 2025-11-29 01:02:44.921 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 20:02:44 np0005539279 nova_compute[187514]: 2025-11-29 01:02:44.922 187518 INFO nova.compute.claims [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.065 187518 DEBUG nova.compute.provider_tree [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.081 187518 DEBUG nova.scheduler.client.report [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.107 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.108 187518 DEBUG nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.151 187518 DEBUG nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.151 187518 DEBUG nova.network.neutron [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.166 187518 INFO nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.184 187518 DEBUG nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.287 187518 DEBUG nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.288 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.289 187518 INFO nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Creating image(s)#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.290 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.291 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.292 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.315 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.404 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.406 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.407 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.429 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.473 187518 DEBUG nova.policy [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.496 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.497 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.538 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.540 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.541 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.612 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.613 187518 DEBUG nova.virt.disk.api [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.614 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.672 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.674 187518 DEBUG nova.virt.disk.api [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.674 187518 DEBUG nova.objects.instance [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f1e4b57-b962-46c9-a111-b97078141733 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.693 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.693 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Ensure instance console log exists: /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.694 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.694 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 20:02:45 np0005539279 nova_compute[187514]: 2025-11-29 01:02:45.695 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:02:46 np0005539279 nova_compute[187514]: 2025-11-29 01:02:46.100 187518 DEBUG nova.network.neutron [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Successfully created port: 3a771fda-08d7-42a6-b958-171092527357 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 20:02:46 np0005539279 nova_compute[187514]: 2025-11-29 01:02:46.891 187518 DEBUG nova.network.neutron [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Successfully updated port: 3a771fda-08d7-42a6-b958-171092527357 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 28 20:02:46 np0005539279 nova_compute[187514]: 2025-11-29 01:02:46.910 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 20:02:46 np0005539279 nova_compute[187514]: 2025-11-29 01:02:46.910 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 20:02:46 np0005539279 nova_compute[187514]: 2025-11-29 01:02:46.911 187518 DEBUG nova.network.neutron [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 28 20:02:47 np0005539279 nova_compute[187514]: 2025-11-29 01:02:47.015 187518 DEBUG nova.compute.manager [req-f2a2f81e-aac0-42fa-883e-397085cfce56 req-85a80e19-55b3-4736-8960-c02853878d4b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received event network-changed-3a771fda-08d7-42a6-b958-171092527357 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 20:02:47 np0005539279 nova_compute[187514]: 2025-11-29 01:02:47.016 187518 DEBUG nova.compute.manager [req-f2a2f81e-aac0-42fa-883e-397085cfce56 req-85a80e19-55b3-4736-8960-c02853878d4b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Refreshing instance network info cache due to event network-changed-3a771fda-08d7-42a6-b958-171092527357. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 20:02:47 np0005539279 nova_compute[187514]: 2025-11-29 01:02:47.016 187518 DEBUG oslo_concurrency.lockutils [req-f2a2f81e-aac0-42fa-883e-397085cfce56 req-85a80e19-55b3-4736-8960-c02853878d4b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 20:02:47 np0005539279 nova_compute[187514]: 2025-11-29 01:02:47.092 187518 DEBUG nova.network.neutron [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 28 20:02:47 np0005539279 nova_compute[187514]: 2025-11-29 01:02:47.337 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:02:47 np0005539279 nova_compute[187514]: 2025-11-29 01:02:47.678 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:02:47 np0005539279 nova_compute[187514]: 2025-11-29 01:02:47.985 187518 DEBUG nova.network.neutron [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Updating instance_info_cache with network_info: [{"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.020 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.021 187518 DEBUG nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Instance network_info: |[{"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.021 187518 DEBUG oslo_concurrency.lockutils [req-f2a2f81e-aac0-42fa-883e-397085cfce56 req-85a80e19-55b3-4736-8960-c02853878d4b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.022 187518 DEBUG nova.network.neutron [req-f2a2f81e-aac0-42fa-883e-397085cfce56 req-85a80e19-55b3-4736-8960-c02853878d4b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Refreshing network info cache for port 3a771fda-08d7-42a6-b958-171092527357 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.027 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Start _get_guest_xml network_info=[{"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.036 187518 WARNING nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.049 187518 DEBUG nova.virt.libvirt.host [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.051 187518 DEBUG nova.virt.libvirt.host [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.056 187518 DEBUG nova.virt.libvirt.host [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.057 187518 DEBUG nova.virt.libvirt.host [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.057 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.058 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.059 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.059 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.060 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.060 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.061 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.061 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.062 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.062 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.062 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.063 187518 DEBUG nova.virt.hardware [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.071 187518 DEBUG nova.virt.libvirt.vif [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:02:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1767310003',display_name='tempest-TestNetworkBasicOps-server-1767310003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1767310003',id=12,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIXY4rDSKA0ybeOn3KCfbAKsWL18lhNQOnDuTytFCUa2G12iUxsE8uFWJRWXPHPlQQFcCmrv2mTrqLXqVy76CFkTaDyN5CkBoo5uXSeYWqICjabH1zc21hLQ9G9uXim5Cw==',key_name='tempest-TestNetworkBasicOps-346262234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-kca1lrf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:02:45Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=1f1e4b57-b962-46c9-a111-b97078141733,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.072 187518 DEBUG nova.network.os_vif_util [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.074 187518 DEBUG nova.network.os_vif_util [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:5f:00,bridge_name='br-int',has_traffic_filtering=True,id=3a771fda-08d7-42a6-b958-171092527357,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a771fda-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.075 187518 DEBUG nova.objects.instance [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f1e4b57-b962-46c9-a111-b97078141733 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.094 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] End _get_guest_xml xml=<domain type="kvm">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <uuid>1f1e4b57-b962-46c9-a111-b97078141733</uuid>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <name>instance-0000000c</name>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-1767310003</nova:name>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 01:02:48</nova:creationTime>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:        <nova:port uuid="3a771fda-08d7-42a6-b958-171092527357">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <entry name="serial">1f1e4b57-b962-46c9-a111-b97078141733</entry>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <entry name="uuid">1f1e4b57-b962-46c9-a111-b97078141733</entry>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk.config"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:5f:5f:00"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <target dev="tap3a771fda-08"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/console.log" append="off"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 20:02:48 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:02:48 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:02:48 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:02:48 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.097 187518 DEBUG nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Preparing to wait for external event network-vif-plugged-3a771fda-08d7-42a6-b958-171092527357 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.097 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "1f1e4b57-b962-46c9-a111-b97078141733-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.098 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.098 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.100 187518 DEBUG nova.virt.libvirt.vif [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:02:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1767310003',display_name='tempest-TestNetworkBasicOps-server-1767310003',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1767310003',id=12,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIXY4rDSKA0ybeOn3KCfbAKsWL18lhNQOnDuTytFCUa2G12iUxsE8uFWJRWXPHPlQQFcCmrv2mTrqLXqVy76CFkTaDyN5CkBoo5uXSeYWqICjabH1zc21hLQ9G9uXim5Cw==',key_name='tempest-TestNetworkBasicOps-346262234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-kca1lrf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:02:45Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=1f1e4b57-b962-46c9-a111-b97078141733,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.100 187518 DEBUG nova.network.os_vif_util [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.101 187518 DEBUG nova.network.os_vif_util [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:5f:00,bridge_name='br-int',has_traffic_filtering=True,id=3a771fda-08d7-42a6-b958-171092527357,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a771fda-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.102 187518 DEBUG os_vif [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:5f:00,bridge_name='br-int',has_traffic_filtering=True,id=3a771fda-08d7-42a6-b958-171092527357,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a771fda-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.103 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.104 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.105 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.110 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.110 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3a771fda-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.111 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3a771fda-08, col_values=(('external_ids', {'iface-id': '3a771fda-08d7-42a6-b958-171092527357', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:5f:00', 'vm-uuid': '1f1e4b57-b962-46c9-a111-b97078141733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.113 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:48 np0005539279 NetworkManager[55703]: <info>  [1764378168.1147] manager: (tap3a771fda-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.116 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.124 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.127 187518 INFO os_vif [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:5f:00,bridge_name='br-int',has_traffic_filtering=True,id=3a771fda-08d7-42a6-b958-171092527357,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a771fda-08')#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.213 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.214 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.215 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:5f:5f:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.217 187518 INFO nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Using config drive#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.597 187518 INFO nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Creating config drive at /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk.config#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.607 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptxi16vod execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.746 187518 DEBUG oslo_concurrency.processutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptxi16vod" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:02:48 np0005539279 kernel: tap3a771fda-08: entered promiscuous mode
Nov 28 20:02:48 np0005539279 NetworkManager[55703]: <info>  [1764378168.8268] manager: (tap3a771fda-08): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 28 20:02:48 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:48Z|00153|binding|INFO|Claiming lport 3a771fda-08d7-42a6-b958-171092527357 for this chassis.
Nov 28 20:02:48 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:48Z|00154|binding|INFO|3a771fda-08d7-42a6-b958-171092527357: Claiming fa:16:3e:5f:5f:00 10.100.0.6
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.828 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:48 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:48.839 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:5f:00 10.100.0.6'], port_security=['fa:16:3e:5f:5f:00 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1f1e4b57-b962-46c9-a111-b97078141733', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': '849bd5a6-43c5-489b-8d1f-770fa48eaff6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=135d424c-826f-4d0f-bd1e-5a6354f7c71b, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=3a771fda-08d7-42a6-b958-171092527357) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:02:48 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:48.842 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 3a771fda-08d7-42a6-b958-171092527357 in datapath f68e6d0c-8d35-4ac7-800e-d8d5def4a774 bound to our chassis#033[00m
Nov 28 20:02:48 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:48.844 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f68e6d0c-8d35-4ac7-800e-d8d5def4a774#033[00m
Nov 28 20:02:48 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:48Z|00155|binding|INFO|Setting lport 3a771fda-08d7-42a6-b958-171092527357 ovn-installed in OVS
Nov 28 20:02:48 np0005539279 ovn_controller[95686]: 2025-11-29T01:02:48Z|00156|binding|INFO|Setting lport 3a771fda-08d7-42a6-b958-171092527357 up in Southbound
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.860 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:48 np0005539279 nova_compute[187514]: 2025-11-29 01:02:48.866 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:48 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:48.870 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[141728cc-df35-4691-920a-cf390d9cec41]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:48 np0005539279 systemd-udevd[219406]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 20:02:48 np0005539279 systemd-machined[153752]: New machine qemu-12-instance-0000000c.
Nov 28 20:02:48 np0005539279 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Nov 28 20:02:48 np0005539279 NetworkManager[55703]: <info>  [1764378168.9100] device (tap3a771fda-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 20:02:48 np0005539279 NetworkManager[55703]: <info>  [1764378168.9129] device (tap3a771fda-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 20:02:48 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:48.911 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[2430fff4-0fd7-47bd-8f81-323b08c72514]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:48 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:48.916 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c33ebc-4470-4cb2-85e3-9a65161bb1c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:48 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:48.960 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[fe06230b-cf81-468f-a0ae-fa6ba0e20587]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:48 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:48.990 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2935b78a-74d1-4847-a11b-ad6711a5cc1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf68e6d0c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:b9:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408685, 'reachable_time': 35944, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219416, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:49 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:49.011 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc15783-c877-44a6-b86c-a4b0d0319fa0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf68e6d0c-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408701, 'tstamp': 408701}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219419, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf68e6d0c-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408705, 'tstamp': 408705}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219419, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:02:49 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:49.013 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf68e6d0c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.015 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:49 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:49.017 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf68e6d0c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:49 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:49.017 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:02:49 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:49.018 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf68e6d0c-80, col_values=(('external_ids', {'iface-id': '3b1483fb-3e2e-4ef3-97c1-8cc0ddc07ca1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:02:49 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:02:49.019 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.130 187518 DEBUG nova.compute.manager [req-97353c28-94d1-4042-9062-2f348aaa3fda req-75de564a-9ecd-4cb4-8902-b48fc54c8fcc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received event network-vif-plugged-3a771fda-08d7-42a6-b958-171092527357 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.131 187518 DEBUG oslo_concurrency.lockutils [req-97353c28-94d1-4042-9062-2f348aaa3fda req-75de564a-9ecd-4cb4-8902-b48fc54c8fcc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "1f1e4b57-b962-46c9-a111-b97078141733-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.132 187518 DEBUG oslo_concurrency.lockutils [req-97353c28-94d1-4042-9062-2f348aaa3fda req-75de564a-9ecd-4cb4-8902-b48fc54c8fcc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.132 187518 DEBUG oslo_concurrency.lockutils [req-97353c28-94d1-4042-9062-2f348aaa3fda req-75de564a-9ecd-4cb4-8902-b48fc54c8fcc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.132 187518 DEBUG nova.compute.manager [req-97353c28-94d1-4042-9062-2f348aaa3fda req-75de564a-9ecd-4cb4-8902-b48fc54c8fcc 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Processing event network-vif-plugged-3a771fda-08d7-42a6-b958-171092527357 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.302 187518 DEBUG nova.network.neutron [req-f2a2f81e-aac0-42fa-883e-397085cfce56 req-85a80e19-55b3-4736-8960-c02853878d4b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Updated VIF entry in instance network info cache for port 3a771fda-08d7-42a6-b958-171092527357. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.303 187518 DEBUG nova.network.neutron [req-f2a2f81e-aac0-42fa-883e-397085cfce56 req-85a80e19-55b3-4736-8960-c02853878d4b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Updating instance_info_cache with network_info: [{"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.317 187518 DEBUG oslo_concurrency.lockutils [req-f2a2f81e-aac0-42fa-883e-397085cfce56 req-85a80e19-55b3-4736-8960-c02853878d4b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.580 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378169.5800204, 1f1e4b57-b962-46c9-a111-b97078141733 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.581 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] VM Started (Lifecycle Event)#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.585 187518 DEBUG nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.591 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.596 187518 INFO nova.virt.libvirt.driver [-] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Instance spawned successfully.#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.597 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.600 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.606 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.620 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.621 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.621 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.622 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.622 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.623 187518 DEBUG nova.virt.libvirt.driver [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.629 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.629 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378169.5804048, 1f1e4b57-b962-46c9-a111-b97078141733 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.629 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] VM Paused (Lifecycle Event)#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.675 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.680 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378169.5891032, 1f1e4b57-b962-46c9-a111-b97078141733 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.681 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] VM Resumed (Lifecycle Event)#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.686 187518 INFO nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Took 4.40 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.686 187518 DEBUG nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.698 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.701 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.721 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.746 187518 INFO nova.compute.manager [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Took 4.90 seconds to build instance.#033[00m
Nov 28 20:02:49 np0005539279 nova_compute[187514]: 2025-11-29 01:02:49.760 187518 DEBUG oslo_concurrency.lockutils [None req-a82b8666-0b31-49df-9df8-3e0885f3446e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.983s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:50 np0005539279 podman[219430]: 2025-11-29 01:02:50.847849312 +0000 UTC m=+0.082881667 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 20:02:50 np0005539279 podman[219431]: 2025-11-29 01:02:50.860012675 +0000 UTC m=+0.086062089 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:02:50 np0005539279 podman[219432]: 2025-11-29 01:02:50.869900502 +0000 UTC m=+0.089038406 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 20:02:51 np0005539279 nova_compute[187514]: 2025-11-29 01:02:51.257 187518 DEBUG nova.compute.manager [req-a5aec63b-0eef-43c6-886e-6b351f4e10a1 req-aee26b5f-5888-4851-a266-25b107be8bf6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received event network-vif-plugged-3a771fda-08d7-42a6-b958-171092527357 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:51 np0005539279 nova_compute[187514]: 2025-11-29 01:02:51.257 187518 DEBUG oslo_concurrency.lockutils [req-a5aec63b-0eef-43c6-886e-6b351f4e10a1 req-aee26b5f-5888-4851-a266-25b107be8bf6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "1f1e4b57-b962-46c9-a111-b97078141733-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:02:51 np0005539279 nova_compute[187514]: 2025-11-29 01:02:51.258 187518 DEBUG oslo_concurrency.lockutils [req-a5aec63b-0eef-43c6-886e-6b351f4e10a1 req-aee26b5f-5888-4851-a266-25b107be8bf6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:02:51 np0005539279 nova_compute[187514]: 2025-11-29 01:02:51.258 187518 DEBUG oslo_concurrency.lockutils [req-a5aec63b-0eef-43c6-886e-6b351f4e10a1 req-aee26b5f-5888-4851-a266-25b107be8bf6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:02:51 np0005539279 nova_compute[187514]: 2025-11-29 01:02:51.258 187518 DEBUG nova.compute.manager [req-a5aec63b-0eef-43c6-886e-6b351f4e10a1 req-aee26b5f-5888-4851-a266-25b107be8bf6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] No waiting events found dispatching network-vif-plugged-3a771fda-08d7-42a6-b958-171092527357 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:02:51 np0005539279 nova_compute[187514]: 2025-11-29 01:02:51.259 187518 WARNING nova.compute.manager [req-a5aec63b-0eef-43c6-886e-6b351f4e10a1 req-aee26b5f-5888-4851-a266-25b107be8bf6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received unexpected event network-vif-plugged-3a771fda-08d7-42a6-b958-171092527357 for instance with vm_state active and task_state None.#033[00m
Nov 28 20:02:52 np0005539279 nova_compute[187514]: 2025-11-29 01:02:52.724 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:53 np0005539279 nova_compute[187514]: 2025-11-29 01:02:53.113 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:53 np0005539279 nova_compute[187514]: 2025-11-29 01:02:53.652 187518 DEBUG nova.compute.manager [req-f8f140ef-ba1a-4451-824c-6fc5a06189ac req-c0626207-1085-4352-a280-1eed30c58a2f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received event network-changed-3a771fda-08d7-42a6-b958-171092527357 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:02:53 np0005539279 nova_compute[187514]: 2025-11-29 01:02:53.653 187518 DEBUG nova.compute.manager [req-f8f140ef-ba1a-4451-824c-6fc5a06189ac req-c0626207-1085-4352-a280-1eed30c58a2f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Refreshing instance network info cache due to event network-changed-3a771fda-08d7-42a6-b958-171092527357. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:02:53 np0005539279 nova_compute[187514]: 2025-11-29 01:02:53.653 187518 DEBUG oslo_concurrency.lockutils [req-f8f140ef-ba1a-4451-824c-6fc5a06189ac req-c0626207-1085-4352-a280-1eed30c58a2f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:02:53 np0005539279 nova_compute[187514]: 2025-11-29 01:02:53.654 187518 DEBUG oslo_concurrency.lockutils [req-f8f140ef-ba1a-4451-824c-6fc5a06189ac req-c0626207-1085-4352-a280-1eed30c58a2f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:02:53 np0005539279 nova_compute[187514]: 2025-11-29 01:02:53.654 187518 DEBUG nova.network.neutron [req-f8f140ef-ba1a-4451-824c-6fc5a06189ac req-c0626207-1085-4352-a280-1eed30c58a2f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Refreshing network info cache for port 3a771fda-08d7-42a6-b958-171092527357 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:02:55 np0005539279 nova_compute[187514]: 2025-11-29 01:02:55.612 187518 DEBUG nova.network.neutron [req-f8f140ef-ba1a-4451-824c-6fc5a06189ac req-c0626207-1085-4352-a280-1eed30c58a2f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Updated VIF entry in instance network info cache for port 3a771fda-08d7-42a6-b958-171092527357. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:02:55 np0005539279 nova_compute[187514]: 2025-11-29 01:02:55.613 187518 DEBUG nova.network.neutron [req-f8f140ef-ba1a-4451-824c-6fc5a06189ac req-c0626207-1085-4352-a280-1eed30c58a2f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Updating instance_info_cache with network_info: [{"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:02:55 np0005539279 nova_compute[187514]: 2025-11-29 01:02:55.640 187518 DEBUG oslo_concurrency.lockutils [req-f8f140ef-ba1a-4451-824c-6fc5a06189ac req-c0626207-1085-4352-a280-1eed30c58a2f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:02:57 np0005539279 nova_compute[187514]: 2025-11-29 01:02:57.754 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:02:58 np0005539279 nova_compute[187514]: 2025-11-29 01:02:58.115 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:01 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:01Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:5f:00 10.100.0.6
Nov 28 20:03:01 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:01Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:5f:00 10.100.0.6
Nov 28 20:03:02 np0005539279 nova_compute[187514]: 2025-11-29 01:03:02.803 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:02 np0005539279 podman[219522]: 2025-11-29 01:03:02.862098035 +0000 UTC m=+0.087970635 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 20:03:02 np0005539279 podman[219521]: 2025-11-29 01:03:02.871469167 +0000 UTC m=+0.101541079 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 20:03:03 np0005539279 nova_compute[187514]: 2025-11-29 01:03:03.118 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:07 np0005539279 nova_compute[187514]: 2025-11-29 01:03:07.848 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:07 np0005539279 podman[219569]: 2025-11-29 01:03:07.874375707 +0000 UTC m=+0.108664176 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 20:03:07 np0005539279 podman[219568]: 2025-11-29 01:03:07.939182759 +0000 UTC m=+0.170231654 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 28 20:03:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:08.097 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:08.097 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:08.098 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:08 np0005539279 nova_compute[187514]: 2025-11-29 01:03:08.121 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:09 np0005539279 nova_compute[187514]: 2025-11-29 01:03:09.683 187518 INFO nova.compute.manager [None req-b7c4fe29-196e-461e-9c1d-658dabfe50d1 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Get console output#033[00m
Nov 28 20:03:09 np0005539279 nova_compute[187514]: 2025-11-29 01:03:09.693 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 20:03:10 np0005539279 nova_compute[187514]: 2025-11-29 01:03:10.690 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:10 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:10.691 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:03:10 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:10.694 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 20:03:10 np0005539279 nova_compute[187514]: 2025-11-29 01:03:10.862 187518 DEBUG nova.compute.manager [req-e113ff35-12e6-4b54-b587-9eea4fbab60b req-5259a8b9-4de9-4cd8-9b01-9490fb97978b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:10 np0005539279 nova_compute[187514]: 2025-11-29 01:03:10.863 187518 DEBUG nova.compute.manager [req-e113ff35-12e6-4b54-b587-9eea4fbab60b req-5259a8b9-4de9-4cd8-9b01-9490fb97978b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing instance network info cache due to event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:03:10 np0005539279 nova_compute[187514]: 2025-11-29 01:03:10.863 187518 DEBUG oslo_concurrency.lockutils [req-e113ff35-12e6-4b54-b587-9eea4fbab60b req-5259a8b9-4de9-4cd8-9b01-9490fb97978b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:03:10 np0005539279 nova_compute[187514]: 2025-11-29 01:03:10.863 187518 DEBUG oslo_concurrency.lockutils [req-e113ff35-12e6-4b54-b587-9eea4fbab60b req-5259a8b9-4de9-4cd8-9b01-9490fb97978b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:03:10 np0005539279 nova_compute[187514]: 2025-11-29 01:03:10.864 187518 DEBUG nova.network.neutron [req-e113ff35-12e6-4b54-b587-9eea4fbab60b req-5259a8b9-4de9-4cd8-9b01-9490fb97978b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:03:11 np0005539279 nova_compute[187514]: 2025-11-29 01:03:11.896 187518 INFO nova.compute.manager [None req-b7b90f3c-68ca-48eb-8e4c-4ab45785529e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Get console output#033[00m
Nov 28 20:03:11 np0005539279 nova_compute[187514]: 2025-11-29 01:03:11.903 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.392 187518 DEBUG nova.network.neutron [req-e113ff35-12e6-4b54-b587-9eea4fbab60b req-5259a8b9-4de9-4cd8-9b01-9490fb97978b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updated VIF entry in instance network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.393 187518 DEBUG nova.network.neutron [req-e113ff35-12e6-4b54-b587-9eea4fbab60b req-5259a8b9-4de9-4cd8-9b01-9490fb97978b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updating instance_info_cache with network_info: [{"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.417 187518 DEBUG oslo_concurrency.lockutils [req-e113ff35-12e6-4b54-b587-9eea4fbab60b req-5259a8b9-4de9-4cd8-9b01-9490fb97978b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.852 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.994 187518 DEBUG nova.compute.manager [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-vif-unplugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.995 187518 DEBUG oslo_concurrency.lockutils [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.995 187518 DEBUG oslo_concurrency.lockutils [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.996 187518 DEBUG oslo_concurrency.lockutils [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.996 187518 DEBUG nova.compute.manager [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] No waiting events found dispatching network-vif-unplugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.997 187518 WARNING nova.compute.manager [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received unexpected event network-vif-unplugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 for instance with vm_state active and task_state None.#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.997 187518 DEBUG nova.compute.manager [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.998 187518 DEBUG oslo_concurrency.lockutils [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.998 187518 DEBUG oslo_concurrency.lockutils [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.998 187518 DEBUG oslo_concurrency.lockutils [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:12 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.999 187518 DEBUG nova.compute.manager [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] No waiting events found dispatching network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:03:13 np0005539279 nova_compute[187514]: 2025-11-29 01:03:12.999 187518 WARNING nova.compute.manager [req-afb2caad-1bad-4518-a218-c5a172a7f83a req-b9137979-9102-49e3-a905-6ccfa0dbb0f2 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received unexpected event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 for instance with vm_state active and task_state None.#033[00m
Nov 28 20:03:13 np0005539279 nova_compute[187514]: 2025-11-29 01:03:13.124 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:13 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:13.697 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:13 np0005539279 nova_compute[187514]: 2025-11-29 01:03:13.905 187518 INFO nova.compute.manager [None req-b04ab63d-61b5-4551-9a76-02a801cf9014 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Get console output#033[00m
Nov 28 20:03:13 np0005539279 nova_compute[187514]: 2025-11-29 01:03:13.912 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 20:03:14 np0005539279 nova_compute[187514]: 2025-11-29 01:03:14.970 187518 DEBUG oslo_concurrency.lockutils [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "1f1e4b57-b962-46c9-a111-b97078141733" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:14 np0005539279 nova_compute[187514]: 2025-11-29 01:03:14.970 187518 DEBUG oslo_concurrency.lockutils [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:14 np0005539279 nova_compute[187514]: 2025-11-29 01:03:14.971 187518 DEBUG oslo_concurrency.lockutils [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "1f1e4b57-b962-46c9-a111-b97078141733-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:14 np0005539279 nova_compute[187514]: 2025-11-29 01:03:14.971 187518 DEBUG oslo_concurrency.lockutils [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:14 np0005539279 nova_compute[187514]: 2025-11-29 01:03:14.972 187518 DEBUG oslo_concurrency.lockutils [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:14 np0005539279 nova_compute[187514]: 2025-11-29 01:03:14.974 187518 INFO nova.compute.manager [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Terminating instance#033[00m
Nov 28 20:03:14 np0005539279 nova_compute[187514]: 2025-11-29 01:03:14.976 187518 DEBUG nova.compute.manager [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 20:03:14 np0005539279 kernel: tap3a771fda-08 (unregistering): left promiscuous mode
Nov 28 20:03:15 np0005539279 NetworkManager[55703]: <info>  [1764378195.0017] device (tap3a771fda-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.017 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:15 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:15Z|00157|binding|INFO|Releasing lport 3a771fda-08d7-42a6-b958-171092527357 from this chassis (sb_readonly=0)
Nov 28 20:03:15 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:15Z|00158|binding|INFO|Setting lport 3a771fda-08d7-42a6-b958-171092527357 down in Southbound
Nov 28 20:03:15 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:15Z|00159|binding|INFO|Removing iface tap3a771fda-08 ovn-installed in OVS
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.021 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.039 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.040 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:5f:00 10.100.0.6'], port_security=['fa:16:3e:5f:5f:00 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1f1e4b57-b962-46c9-a111-b97078141733', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': '849bd5a6-43c5-489b-8d1f-770fa48eaff6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=135d424c-826f-4d0f-bd1e-5a6354f7c71b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=3a771fda-08d7-42a6-b958-171092527357) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.043 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 3a771fda-08d7-42a6-b958-171092527357 in datapath f68e6d0c-8d35-4ac7-800e-d8d5def4a774 unbound from our chassis#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.045 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f68e6d0c-8d35-4ac7-800e-d8d5def4a774#033[00m
Nov 28 20:03:15 np0005539279 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 28 20:03:15 np0005539279 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 13.233s CPU time.
Nov 28 20:03:15 np0005539279 systemd-machined[153752]: Machine qemu-12-instance-0000000c terminated.
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.078 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[06386d61-3e52-4f8d-96ef-866690ff0f1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.089 187518 DEBUG nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.089 187518 DEBUG nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing instance network info cache due to event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.090 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.090 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.090 187518 DEBUG nova.network.neutron [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.128 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[a9065e14-a8c7-4554-8f57-3470c1eb3740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.131 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[30932a93-5bc8-4f63-ae28-e81b974aa60f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.185 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f8a067-f5c0-4ad9-a1a4-a80177ba11d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.214 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[915490e0-622b-464f-a8f7-9447a615429b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf68e6d0c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:b9:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408685, 'reachable_time': 35944, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219625, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.239 187518 DEBUG nova.compute.manager [req-0e4fd6f3-c030-4c7c-8238-d9363b7d82f8 req-19c1a7dd-e3df-4551-a486-f61ebf0b13a6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received event network-vif-unplugged-3a771fda-08d7-42a6-b958-171092527357 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.239 187518 DEBUG oslo_concurrency.lockutils [req-0e4fd6f3-c030-4c7c-8238-d9363b7d82f8 req-19c1a7dd-e3df-4551-a486-f61ebf0b13a6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "1f1e4b57-b962-46c9-a111-b97078141733-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.240 187518 DEBUG oslo_concurrency.lockutils [req-0e4fd6f3-c030-4c7c-8238-d9363b7d82f8 req-19c1a7dd-e3df-4551-a486-f61ebf0b13a6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.240 187518 DEBUG oslo_concurrency.lockutils [req-0e4fd6f3-c030-4c7c-8238-d9363b7d82f8 req-19c1a7dd-e3df-4551-a486-f61ebf0b13a6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.240 187518 DEBUG nova.compute.manager [req-0e4fd6f3-c030-4c7c-8238-d9363b7d82f8 req-19c1a7dd-e3df-4551-a486-f61ebf0b13a6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] No waiting events found dispatching network-vif-unplugged-3a771fda-08d7-42a6-b958-171092527357 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.241 187518 DEBUG nova.compute.manager [req-0e4fd6f3-c030-4c7c-8238-d9363b7d82f8 req-19c1a7dd-e3df-4551-a486-f61ebf0b13a6 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received event network-vif-unplugged-3a771fda-08d7-42a6-b958-171092527357 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.246 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[70ffa12f-3684-41a0-af27-97605a55c0f2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf68e6d0c-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408701, 'tstamp': 408701}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219633, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf68e6d0c-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408705, 'tstamp': 408705}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219633, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.249 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf68e6d0c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.250 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.258 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.259 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf68e6d0c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.261 187518 INFO nova.virt.libvirt.driver [-] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Instance destroyed successfully.#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.260 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.261 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf68e6d0c-80, col_values=(('external_ids', {'iface-id': '3b1483fb-3e2e-4ef3-97c1-8cc0ddc07ca1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:15 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:15.261 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.261 187518 DEBUG nova.objects.instance [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid 1f1e4b57-b962-46c9-a111-b97078141733 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.277 187518 DEBUG nova.virt.libvirt.vif [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T01:02:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1767310003',display_name='tempest-TestNetworkBasicOps-server-1767310003',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1767310003',id=12,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIXY4rDSKA0ybeOn3KCfbAKsWL18lhNQOnDuTytFCUa2G12iUxsE8uFWJRWXPHPlQQFcCmrv2mTrqLXqVy76CFkTaDyN5CkBoo5uXSeYWqICjabH1zc21hLQ9G9uXim5Cw==',key_name='tempest-TestNetworkBasicOps-346262234',keypairs=<?>,launch_index=0,launched_at=2025-11-29T01:02:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-kca1lrf9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T01:02:49Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=1f1e4b57-b962-46c9-a111-b97078141733,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.277 187518 DEBUG nova.network.os_vif_util [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "3a771fda-08d7-42a6-b958-171092527357", "address": "fa:16:3e:5f:5f:00", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3a771fda-08", "ovs_interfaceid": "3a771fda-08d7-42a6-b958-171092527357", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.278 187518 DEBUG nova.network.os_vif_util [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:5f:00,bridge_name='br-int',has_traffic_filtering=True,id=3a771fda-08d7-42a6-b958-171092527357,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a771fda-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.279 187518 DEBUG os_vif [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:5f:00,bridge_name='br-int',has_traffic_filtering=True,id=3a771fda-08d7-42a6-b958-171092527357,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a771fda-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.281 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.282 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3a771fda-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.285 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.292 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.297 187518 INFO os_vif [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:5f:00,bridge_name='br-int',has_traffic_filtering=True,id=3a771fda-08d7-42a6-b958-171092527357,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3a771fda-08')#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.298 187518 INFO nova.virt.libvirt.driver [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Deleting instance files /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733_del#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.299 187518 INFO nova.virt.libvirt.driver [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Deletion of /var/lib/nova/instances/1f1e4b57-b962-46c9-a111-b97078141733_del complete#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.366 187518 INFO nova.compute.manager [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.367 187518 DEBUG oslo.service.loopingcall [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.367 187518 DEBUG nova.compute.manager [-] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 20:03:15 np0005539279 nova_compute[187514]: 2025-11-29 01:03:15.367 187518 DEBUG nova.network.neutron [-] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 20:03:17 np0005539279 nova_compute[187514]: 2025-11-29 01:03:17.366 187518 DEBUG nova.compute.manager [req-1d6f643b-75c1-429a-8dd7-f85d2c7cc360 req-69c2c318-d9f6-4061-a2a0-611037cf888b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received event network-vif-plugged-3a771fda-08d7-42a6-b958-171092527357 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:17 np0005539279 nova_compute[187514]: 2025-11-29 01:03:17.367 187518 DEBUG oslo_concurrency.lockutils [req-1d6f643b-75c1-429a-8dd7-f85d2c7cc360 req-69c2c318-d9f6-4061-a2a0-611037cf888b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "1f1e4b57-b962-46c9-a111-b97078141733-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:17 np0005539279 nova_compute[187514]: 2025-11-29 01:03:17.367 187518 DEBUG oslo_concurrency.lockutils [req-1d6f643b-75c1-429a-8dd7-f85d2c7cc360 req-69c2c318-d9f6-4061-a2a0-611037cf888b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:17 np0005539279 nova_compute[187514]: 2025-11-29 01:03:17.368 187518 DEBUG oslo_concurrency.lockutils [req-1d6f643b-75c1-429a-8dd7-f85d2c7cc360 req-69c2c318-d9f6-4061-a2a0-611037cf888b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:17 np0005539279 nova_compute[187514]: 2025-11-29 01:03:17.368 187518 DEBUG nova.compute.manager [req-1d6f643b-75c1-429a-8dd7-f85d2c7cc360 req-69c2c318-d9f6-4061-a2a0-611037cf888b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] No waiting events found dispatching network-vif-plugged-3a771fda-08d7-42a6-b958-171092527357 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:03:17 np0005539279 nova_compute[187514]: 2025-11-29 01:03:17.369 187518 WARNING nova.compute.manager [req-1d6f643b-75c1-429a-8dd7-f85d2c7cc360 req-69c2c318-d9f6-4061-a2a0-611037cf888b 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received unexpected event network-vif-plugged-3a771fda-08d7-42a6-b958-171092527357 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 20:03:17 np0005539279 nova_compute[187514]: 2025-11-29 01:03:17.855 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.703 187518 DEBUG nova.network.neutron [-] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.736 187518 INFO nova.compute.manager [-] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Took 3.37 seconds to deallocate network for instance.#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.802 187518 DEBUG oslo_concurrency.lockutils [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.803 187518 DEBUG oslo_concurrency.lockutils [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.894 187518 DEBUG nova.compute.provider_tree [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.913 187518 DEBUG nova.scheduler.client.report [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.949 187518 DEBUG oslo_concurrency.lockutils [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.956 187518 DEBUG nova.network.neutron [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updated VIF entry in instance network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.957 187518 DEBUG nova.network.neutron [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updating instance_info_cache with network_info: [{"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.974 187518 INFO nova.scheduler.client.report [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance 1f1e4b57-b962-46c9-a111-b97078141733#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.979 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.980 187518 DEBUG nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.980 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.980 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.981 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.981 187518 DEBUG nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] No waiting events found dispatching network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.982 187518 WARNING nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received unexpected event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 for instance with vm_state active and task_state None.#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.982 187518 DEBUG nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.983 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.983 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.983 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.984 187518 DEBUG nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] No waiting events found dispatching network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.984 187518 WARNING nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received unexpected event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 for instance with vm_state active and task_state None.#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.985 187518 DEBUG nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received event network-changed-3a771fda-08d7-42a6-b958-171092527357 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.985 187518 DEBUG nova.compute.manager [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Refreshing instance network info cache due to event network-changed-3a771fda-08d7-42a6-b958-171092527357. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.985 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.986 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:03:18 np0005539279 nova_compute[187514]: 2025-11-29 01:03:18.986 187518 DEBUG nova.network.neutron [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Refreshing network info cache for port 3a771fda-08d7-42a6-b958-171092527357 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:03:19 np0005539279 nova_compute[187514]: 2025-11-29 01:03:19.051 187518 DEBUG oslo_concurrency.lockutils [None req-cb32a91e-613c-45b1-9f1b-4ca0f2abbba0 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "1f1e4b57-b962-46c9-a111-b97078141733" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:19 np0005539279 nova_compute[187514]: 2025-11-29 01:03:19.115 187518 DEBUG nova.network.neutron [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 20:03:19 np0005539279 nova_compute[187514]: 2025-11-29 01:03:19.432 187518 DEBUG nova.compute.manager [req-cec1dfcc-faa7-4829-9896-bf7442aeb220 req-4802069d-cc62-4b42-bcd1-ad2d2fc06007 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Received event network-vif-deleted-3a771fda-08d7-42a6-b958-171092527357 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:19 np0005539279 nova_compute[187514]: 2025-11-29 01:03:19.572 187518 DEBUG nova.network.neutron [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:19 np0005539279 nova_compute[187514]: 2025-11-29 01:03:19.593 187518 DEBUG oslo_concurrency.lockutils [req-7e257e03-62a5-4f09-b783-ff9b898f9c7d req-77ca9692-ee3b-486a-a1b2-6214970b2fc5 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-1f1e4b57-b962-46c9-a111-b97078141733" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:03:20 np0005539279 nova_compute[187514]: 2025-11-29 01:03:20.284 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.788 187518 DEBUG nova.compute.manager [req-c32b1ab6-c1ab-43af-a248-f08fc39c72f2 req-f8ecf3df-83bf-435a-a751-9d3d5a73c046 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.788 187518 DEBUG nova.compute.manager [req-c32b1ab6-c1ab-43af-a248-f08fc39c72f2 req-f8ecf3df-83bf-435a-a751-9d3d5a73c046 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing instance network info cache due to event network-changed-4b2bcfa2-03bd-4475-b868-15c0531e30d4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.789 187518 DEBUG oslo_concurrency.lockutils [req-c32b1ab6-c1ab-43af-a248-f08fc39c72f2 req-f8ecf3df-83bf-435a-a751-9d3d5a73c046 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.789 187518 DEBUG oslo_concurrency.lockutils [req-c32b1ab6-c1ab-43af-a248-f08fc39c72f2 req-f8ecf3df-83bf-435a-a751-9d3d5a73c046 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.789 187518 DEBUG nova.network.neutron [req-c32b1ab6-c1ab-43af-a248-f08fc39c72f2 req-f8ecf3df-83bf-435a-a751-9d3d5a73c046 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Refreshing network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:03:21 np0005539279 podman[219645]: 2025-11-29 01:03:21.884262177 +0000 UTC m=+0.110200151 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 20:03:21 np0005539279 podman[219646]: 2025-11-29 01:03:21.890712704 +0000 UTC m=+0.109432498 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.898 187518 DEBUG oslo_concurrency.lockutils [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.899 187518 DEBUG oslo_concurrency.lockutils [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.899 187518 DEBUG oslo_concurrency.lockutils [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.900 187518 DEBUG oslo_concurrency.lockutils [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.900 187518 DEBUG oslo_concurrency.lockutils [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:21 np0005539279 podman[219644]: 2025-11-29 01:03:21.900800887 +0000 UTC m=+0.126740061 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.902 187518 INFO nova.compute.manager [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Terminating instance#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.903 187518 DEBUG nova.compute.manager [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 20:03:21 np0005539279 kernel: tap4b2bcfa2-03 (unregistering): left promiscuous mode
Nov 28 20:03:21 np0005539279 NetworkManager[55703]: <info>  [1764378201.9345] device (tap4b2bcfa2-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 20:03:21 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:21Z|00160|binding|INFO|Releasing lport 4b2bcfa2-03bd-4475-b868-15c0531e30d4 from this chassis (sb_readonly=0)
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.942 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:21 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:21Z|00161|binding|INFO|Setting lport 4b2bcfa2-03bd-4475-b868-15c0531e30d4 down in Southbound
Nov 28 20:03:21 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:21Z|00162|binding|INFO|Removing iface tap4b2bcfa2-03 ovn-installed in OVS
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.946 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:21.952 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:25:7c 10.100.0.14'], port_security=['fa:16:3e:8c:25:7c 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '4934aff6-60d5-416e-97b3-bed2dbe82055', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '8', 'neutron:security_group_ids': '620dc653-8302-473f-918a-b571669e756e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=135d424c-826f-4d0f-bd1e-5a6354f7c71b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=4b2bcfa2-03bd-4475-b868-15c0531e30d4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:03:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:21.954 104584 INFO neutron.agent.ovn.metadata.agent [-] Port 4b2bcfa2-03bd-4475-b868-15c0531e30d4 in datapath f68e6d0c-8d35-4ac7-800e-d8d5def4a774 unbound from our chassis#033[00m
Nov 28 20:03:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:21.956 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f68e6d0c-8d35-4ac7-800e-d8d5def4a774, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 20:03:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:21.958 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[799529ef-1295-40a5-9b96-8b53bd8b7ebb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:21.959 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774 namespace which is not needed anymore#033[00m
Nov 28 20:03:21 np0005539279 nova_compute[187514]: 2025-11-29 01:03:21.961 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:21 np0005539279 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 28 20:03:21 np0005539279 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 13.556s CPU time.
Nov 28 20:03:21 np0005539279 systemd-machined[153752]: Machine qemu-11-instance-0000000b terminated.
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.142 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:22 np0005539279 neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774[219292]: [NOTICE]   (219296) : haproxy version is 2.8.14-c23fe91
Nov 28 20:03:22 np0005539279 neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774[219292]: [NOTICE]   (219296) : path to executable is /usr/sbin/haproxy
Nov 28 20:03:22 np0005539279 neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774[219292]: [WARNING]  (219296) : Exiting Master process...
Nov 28 20:03:22 np0005539279 neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774[219292]: [ALERT]    (219296) : Current worker (219299) exited with code 143 (Terminated)
Nov 28 20:03:22 np0005539279 neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774[219292]: [WARNING]  (219296) : All workers exited. Exiting... (0)
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.150 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:22 np0005539279 systemd[1]: libpod-83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe.scope: Deactivated successfully.
Nov 28 20:03:22 np0005539279 podman[219731]: 2025-11-29 01:03:22.15575468 +0000 UTC m=+0.068871321 container died 83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:03:22 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe-userdata-shm.mount: Deactivated successfully.
Nov 28 20:03:22 np0005539279 systemd[1]: var-lib-containers-storage-overlay-3d52db0c140c40629cb7522c7c735f9c837b2409735c59c793b6741a0bc7ef39-merged.mount: Deactivated successfully.
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.201 187518 INFO nova.virt.libvirt.driver [-] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Instance destroyed successfully.#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.201 187518 DEBUG nova.objects.instance [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid 4934aff6-60d5-416e-97b3-bed2dbe82055 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:03:22 np0005539279 podman[219731]: 2025-11-29 01:03:22.212844747 +0000 UTC m=+0.125961398 container cleanup 83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.221 187518 DEBUG nova.virt.libvirt.vif [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T01:02:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1184505323',display_name='tempest-TestNetworkBasicOps-server-1184505323',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1184505323',id=11,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKg+pcfICyoy0PdUh7IAeh34OcMnvlgfDaE12t4Y/9/QjK5SA5Wnh43QNi8YWI2ahCwmMnrIQ65ramNfTIFEmyqXTFXzbnkjufkp+4+PrWJhxjICuVaXjNX4nlafiL43BQ==',key_name='tempest-TestNetworkBasicOps-1133988200',keypairs=<?>,launch_index=0,launched_at=2025-11-29T01:02:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-bkbw0tu9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T01:02:33Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=4934aff6-60d5-416e-97b3-bed2dbe82055,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.222 187518 DEBUG nova.network.os_vif_util [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.223 187518 DEBUG nova.network.os_vif_util [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:25:7c,bridge_name='br-int',has_traffic_filtering=True,id=4b2bcfa2-03bd-4475-b868-15c0531e30d4,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2bcfa2-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.223 187518 DEBUG os_vif [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:25:7c,bridge_name='br-int',has_traffic_filtering=True,id=4b2bcfa2-03bd-4475-b868-15c0531e30d4,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2bcfa2-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.225 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.225 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b2bcfa2-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.227 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.230 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.232 187518 INFO os_vif [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:25:7c,bridge_name='br-int',has_traffic_filtering=True,id=4b2bcfa2-03bd-4475-b868-15c0531e30d4,network=Network(f68e6d0c-8d35-4ac7-800e-d8d5def4a774),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b2bcfa2-03')#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.233 187518 INFO nova.virt.libvirt.driver [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Deleting instance files /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055_del#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.233 187518 INFO nova.virt.libvirt.driver [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Deletion of /var/lib/nova/instances/4934aff6-60d5-416e-97b3-bed2dbe82055_del complete#033[00m
Nov 28 20:03:22 np0005539279 systemd[1]: libpod-conmon-83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe.scope: Deactivated successfully.
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.295 187518 INFO nova.compute.manager [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.296 187518 DEBUG oslo.service.loopingcall [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.296 187518 DEBUG nova.compute.manager [-] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.297 187518 DEBUG nova.network.neutron [-] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 20:03:22 np0005539279 podman[219775]: 2025-11-29 01:03:22.318583077 +0000 UTC m=+0.064188404 container remove 83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 20:03:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:22.326 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[21a036f8-d5b3-4bb0-a77d-7468211be595]: (4, ('Sat Nov 29 01:03:22 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774 (83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe)\n83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe\nSat Nov 29 01:03:22 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774 (83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe)\n83f5a069a3cc29a099d66b50fdda694cec130aeb8fc76007ea6840d0596627fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:22.329 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[17a3c86c-6916-46c7-ab2b-80dcc9c66a26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:22.330 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf68e6d0c-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:22 np0005539279 kernel: tapf68e6d0c-80: left promiscuous mode
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.333 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.349 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:22.352 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[5fab885b-2a7c-4cfd-86d4-94e4602dd7f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:22.377 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[23ed3de0-207a-49a3-b3bc-35e96f4b6003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:22.378 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[964aa8ab-3190-4c76-aa04-3d939da6cba4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:22.406 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc698a0-33e6-4a11-b6a8-ad7d688a7a2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408676, 'reachable_time': 28140, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219791, 'error': None, 'target': 'ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:22.412 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f68e6d0c-8d35-4ac7-800e-d8d5def4a774 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 20:03:22 np0005539279 systemd[1]: run-netns-ovnmeta\x2df68e6d0c\x2d8d35\x2d4ac7\x2d800e\x2dd8d5def4a774.mount: Deactivated successfully.
Nov 28 20:03:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:22.412 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[25ec7254-0117-444d-a783-f66eed7d13ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.630 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.631 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.631 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 20:03:22 np0005539279 nova_compute[187514]: 2025-11-29 01:03:22.846 187518 DEBUG nova.network.neutron [-] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.062 187518 DEBUG nova.network.neutron [req-c32b1ab6-c1ab-43af-a248-f08fc39c72f2 req-f8ecf3df-83bf-435a-a751-9d3d5a73c046 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updated VIF entry in instance network info cache for port 4b2bcfa2-03bd-4475-b868-15c0531e30d4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.063 187518 DEBUG nova.network.neutron [req-c32b1ab6-c1ab-43af-a248-f08fc39c72f2 req-f8ecf3df-83bf-435a-a751-9d3d5a73c046 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updating instance_info_cache with network_info: [{"id": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "address": "fa:16:3e:8c:25:7c", "network": {"id": "f68e6d0c-8d35-4ac7-800e-d8d5def4a774", "bridge": "br-int", "label": "tempest-network-smoke--1039480258", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b2bcfa2-03", "ovs_interfaceid": "4b2bcfa2-03bd-4475-b868-15c0531e30d4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.065 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.067 187518 DEBUG nova.compute.manager [req-03388d39-32f3-493b-8151-966bb081456a req-947f1171-27b0-4571-9551-7504cfaf138e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-vif-deleted-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.067 187518 INFO nova.compute.manager [req-03388d39-32f3-493b-8151-966bb081456a req-947f1171-27b0-4571-9551-7504cfaf138e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Neutron deleted interface 4b2bcfa2-03bd-4475-b868-15c0531e30d4; detaching it from the instance and deleting it from the info cache#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.068 187518 DEBUG nova.network.neutron [req-03388d39-32f3-493b-8151-966bb081456a req-947f1171-27b0-4571-9551-7504cfaf138e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.098 187518 INFO nova.compute.manager [-] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Took 0.80 seconds to deallocate network for instance.#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.103 187518 DEBUG oslo_concurrency.lockutils [req-c32b1ab6-c1ab-43af-a248-f08fc39c72f2 req-f8ecf3df-83bf-435a-a751-9d3d5a73c046 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-4934aff6-60d5-416e-97b3-bed2dbe82055" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.154 187518 DEBUG nova.compute.manager [req-03388d39-32f3-493b-8151-966bb081456a req-947f1171-27b0-4571-9551-7504cfaf138e 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Detach interface failed, port_id=4b2bcfa2-03bd-4475-b868-15c0531e30d4, reason: Instance 4934aff6-60d5-416e-97b3-bed2dbe82055 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.156 187518 DEBUG oslo_concurrency.lockutils [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.156 187518 DEBUG oslo_concurrency.lockutils [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.399 187518 DEBUG nova.compute.provider_tree [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.422 187518 DEBUG nova.scheduler.client.report [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.456 187518 DEBUG oslo_concurrency.lockutils [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.556 187518 INFO nova.scheduler.client.report [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance 4934aff6-60d5-416e-97b3-bed2dbe82055#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.646 187518 DEBUG oslo_concurrency.lockutils [None req-d0b96ef1-128e-42b9-83ef-f67fb848caae 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.648 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.877 187518 DEBUG nova.compute.manager [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-vif-unplugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.878 187518 DEBUG oslo_concurrency.lockutils [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.879 187518 DEBUG oslo_concurrency.lockutils [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.879 187518 DEBUG oslo_concurrency.lockutils [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.879 187518 DEBUG nova.compute.manager [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] No waiting events found dispatching network-vif-unplugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.880 187518 WARNING nova.compute.manager [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received unexpected event network-vif-unplugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.880 187518 DEBUG nova.compute.manager [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.880 187518 DEBUG oslo_concurrency.lockutils [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.881 187518 DEBUG oslo_concurrency.lockutils [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.881 187518 DEBUG oslo_concurrency.lockutils [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "4934aff6-60d5-416e-97b3-bed2dbe82055-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.881 187518 DEBUG nova.compute.manager [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] No waiting events found dispatching network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:03:23 np0005539279 nova_compute[187514]: 2025-11-29 01:03:23.882 187518 WARNING nova.compute.manager [req-373cd2de-a26a-4f4e-8521-ff2c9a2464a8 req-1ba15776-0385-487c-bcac-8f36ee415dde 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Received unexpected event network-vif-plugged-4b2bcfa2-03bd-4475-b868-15c0531e30d4 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.637 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.638 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.639 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.639 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.897 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.900 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5752MB free_disk=73.33921813964844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.900 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.900 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.958 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.959 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:03:25 np0005539279 nova_compute[187514]: 2025-11-29 01:03:25.992 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:03:26 np0005539279 nova_compute[187514]: 2025-11-29 01:03:26.011 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:03:26 np0005539279 nova_compute[187514]: 2025-11-29 01:03:26.043 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:03:26 np0005539279 nova_compute[187514]: 2025-11-29 01:03:26.043 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:26 np0005539279 nova_compute[187514]: 2025-11-29 01:03:26.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:27 np0005539279 nova_compute[187514]: 2025-11-29 01:03:27.229 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:27 np0005539279 nova_compute[187514]: 2025-11-29 01:03:27.860 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:28 np0005539279 nova_compute[187514]: 2025-11-29 01:03:28.652 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:28 np0005539279 nova_compute[187514]: 2025-11-29 01:03:28.652 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:03:28 np0005539279 nova_compute[187514]: 2025-11-29 01:03:28.652 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:03:28 np0005539279 nova_compute[187514]: 2025-11-29 01:03:28.673 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:03:28 np0005539279 nova_compute[187514]: 2025-11-29 01:03:28.673 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:30 np0005539279 nova_compute[187514]: 2025-11-29 01:03:30.139 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:30 np0005539279 nova_compute[187514]: 2025-11-29 01:03:30.279 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764378195.25675, 1f1e4b57-b962-46c9-a111-b97078141733 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:03:30 np0005539279 nova_compute[187514]: 2025-11-29 01:03:30.280 187518 INFO nova.compute.manager [-] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] VM Stopped (Lifecycle Event)#033[00m
Nov 28 20:03:30 np0005539279 nova_compute[187514]: 2025-11-29 01:03:30.282 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:30 np0005539279 nova_compute[187514]: 2025-11-29 01:03:30.302 187518 DEBUG nova.compute.manager [None req-325f8313-3afb-4bf1-ac32-1916400999dd - - - - - -] [instance: 1f1e4b57-b962-46c9-a111-b97078141733] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:03:30 np0005539279 nova_compute[187514]: 2025-11-29 01:03:30.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:30 np0005539279 nova_compute[187514]: 2025-11-29 01:03:30.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:03:32 np0005539279 nova_compute[187514]: 2025-11-29 01:03:32.233 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:03:32.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:03:32 np0005539279 nova_compute[187514]: 2025-11-29 01:03:32.861 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:33 np0005539279 nova_compute[187514]: 2025-11-29 01:03:33.605 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:33 np0005539279 nova_compute[187514]: 2025-11-29 01:03:33.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:33 np0005539279 podman[219797]: 2025-11-29 01:03:33.829809363 +0000 UTC m=+0.072762844 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 20:03:33 np0005539279 podman[219796]: 2025-11-29 01:03:33.849437618 +0000 UTC m=+0.100014978 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Nov 28 20:03:35 np0005539279 nova_compute[187514]: 2025-11-29 01:03:35.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:37 np0005539279 nova_compute[187514]: 2025-11-29 01:03:37.198 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764378202.197357, 4934aff6-60d5-416e-97b3-bed2dbe82055 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:03:37 np0005539279 nova_compute[187514]: 2025-11-29 01:03:37.199 187518 INFO nova.compute.manager [-] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] VM Stopped (Lifecycle Event)#033[00m
Nov 28 20:03:37 np0005539279 nova_compute[187514]: 2025-11-29 01:03:37.237 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:37 np0005539279 nova_compute[187514]: 2025-11-29 01:03:37.244 187518 DEBUG nova.compute.manager [None req-5026beab-a5f6-435a-b593-f6b99e0028e6 - - - - - -] [instance: 4934aff6-60d5-416e-97b3-bed2dbe82055] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:03:37 np0005539279 nova_compute[187514]: 2025-11-29 01:03:37.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:03:37 np0005539279 nova_compute[187514]: 2025-11-29 01:03:37.904 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:38 np0005539279 podman[219845]: 2025-11-29 01:03:38.861253887 +0000 UTC m=+0.088637571 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:03:38 np0005539279 podman[219844]: 2025-11-29 01:03:38.915391824 +0000 UTC m=+0.148173003 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 20:03:42 np0005539279 nova_compute[187514]: 2025-11-29 01:03:42.241 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:42 np0005539279 nova_compute[187514]: 2025-11-29 01:03:42.908 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:42 np0005539279 nova_compute[187514]: 2025-11-29 01:03:42.974 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:42 np0005539279 nova_compute[187514]: 2025-11-29 01:03:42.975 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:42 np0005539279 nova_compute[187514]: 2025-11-29 01:03:42.993 187518 DEBUG nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.088 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.088 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.100 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.101 187518 INFO nova.compute.claims [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Claim successful on node compute-0.ctlplane.example.com#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.338 187518 DEBUG nova.compute.provider_tree [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.366 187518 DEBUG nova.scheduler.client.report [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.393 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.305s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.394 187518 DEBUG nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.470 187518 DEBUG nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.471 187518 DEBUG nova.network.neutron [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.502 187518 INFO nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.529 187518 DEBUG nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.647 187518 DEBUG nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.649 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.650 187518 INFO nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Creating image(s)#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.651 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "/var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.652 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.653 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "/var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.677 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.773 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.775 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.776 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.798 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.876 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.878 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.933 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f,backing_fmt=raw /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.935 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "6fb42ae74ead6d4553428b24aefa9862c3ae2e5f" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:43 np0005539279 nova_compute[187514]: 2025-11-29 01:03:43.935 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.003 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/6fb42ae74ead6d4553428b24aefa9862c3ae2e5f --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.005 187518 DEBUG nova.virt.disk.api [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Checking if we can resize image /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.006 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.101 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.102 187518 DEBUG nova.virt.disk.api [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Cannot resize image /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.103 187518 DEBUG nova.objects.instance [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'migration_context' on Instance uuid 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.119 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.120 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Ensure instance console log exists: /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.121 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.121 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.122 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:44 np0005539279 nova_compute[187514]: 2025-11-29 01:03:44.497 187518 DEBUG nova.policy [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1680be98de9e48a19f46eb0bbdfec6fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 20:03:47 np0005539279 nova_compute[187514]: 2025-11-29 01:03:47.245 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:47 np0005539279 nova_compute[187514]: 2025-11-29 01:03:47.940 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:48 np0005539279 nova_compute[187514]: 2025-11-29 01:03:48.725 187518 DEBUG nova.network.neutron [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Successfully created port: b7f6b0cd-1f1b-4bb3-abcd-720615d7920a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 20:03:50 np0005539279 nova_compute[187514]: 2025-11-29 01:03:50.593 187518 DEBUG nova.network.neutron [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Successfully updated port: b7f6b0cd-1f1b-4bb3-abcd-720615d7920a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 20:03:50 np0005539279 nova_compute[187514]: 2025-11-29 01:03:50.613 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:03:50 np0005539279 nova_compute[187514]: 2025-11-29 01:03:50.613 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquired lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:03:50 np0005539279 nova_compute[187514]: 2025-11-29 01:03:50.613 187518 DEBUG nova.network.neutron [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 20:03:50 np0005539279 nova_compute[187514]: 2025-11-29 01:03:50.730 187518 DEBUG nova.compute.manager [req-157daa45-0c8c-4797-9e18-46a3934f637d req-7ea9a11c-ac13-4c1d-9586-7895a1c81fd7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received event network-changed-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:50 np0005539279 nova_compute[187514]: 2025-11-29 01:03:50.730 187518 DEBUG nova.compute.manager [req-157daa45-0c8c-4797-9e18-46a3934f637d req-7ea9a11c-ac13-4c1d-9586-7895a1c81fd7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Refreshing instance network info cache due to event network-changed-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:03:50 np0005539279 nova_compute[187514]: 2025-11-29 01:03:50.730 187518 DEBUG oslo_concurrency.lockutils [req-157daa45-0c8c-4797-9e18-46a3934f637d req-7ea9a11c-ac13-4c1d-9586-7895a1c81fd7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:03:50 np0005539279 nova_compute[187514]: 2025-11-29 01:03:50.779 187518 DEBUG nova.network.neutron [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.139 187518 DEBUG nova.network.neutron [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Updating instance_info_cache with network_info: [{"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.169 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Releasing lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.170 187518 DEBUG nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Instance network_info: |[{"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.171 187518 DEBUG oslo_concurrency.lockutils [req-157daa45-0c8c-4797-9e18-46a3934f637d req-7ea9a11c-ac13-4c1d-9586-7895a1c81fd7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.171 187518 DEBUG nova.network.neutron [req-157daa45-0c8c-4797-9e18-46a3934f637d req-7ea9a11c-ac13-4c1d-9586-7895a1c81fd7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Refreshing network info cache for port b7f6b0cd-1f1b-4bb3-abcd-720615d7920a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.176 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Start _get_guest_xml network_info=[{"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'encryption_format': None, 'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'encrypted': False, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'guest_format': None, 'image_id': '017f04d5-006e-46df-a06f-ac852f70dddf'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.184 187518 WARNING nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.189 187518 DEBUG nova.virt.libvirt.host [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.190 187518 DEBUG nova.virt.libvirt.host [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.195 187518 DEBUG nova.virt.libvirt.host [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.196 187518 DEBUG nova.virt.libvirt.host [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.196 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.197 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-29T00:52:59Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6ce17e5f-9ac5-497d-adc9-1357453b4367',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-29T00:53:00Z,direct_url=<?>,disk_format='qcow2',id=017f04d5-006e-46df-a06f-ac852f70dddf,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='6eabfaada87c45439569e038a74b4318',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-29T00:53:01Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.197 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.197 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.198 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.198 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.198 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.199 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.199 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.199 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.199 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.199 187518 DEBUG nova.virt.hardware [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.204 187518 DEBUG nova.virt.libvirt.vif [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:03:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1352240053',display_name='tempest-TestNetworkBasicOps-server-1352240053',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1352240053',id=13,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIbqU/+Ly64ZgevQM1MV/N1d9/dd7HZ9Go69ePa6+HhFuPM6aom4NhK8S663PTiK4LYIxQ4b42dznmCYdks0H80tHFS0HWc4yknTnsj0Te+vyv49ABIJi5J78rVHIlIZqw==',key_name='tempest-TestNetworkBasicOps-1947031926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-nx2cxbr5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:03:43Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.204 187518 DEBUG nova.network.os_vif_util [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.206 187518 DEBUG nova.network.os_vif_util [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:96:13,bridge_name='br-int',has_traffic_filtering=True,id=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a,network=Network(93624e72-4991-4db9-bfa5-cd34c3d50e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f6b0cd-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.207 187518 DEBUG nova.objects.instance [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.225 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] End _get_guest_xml xml=<domain type="kvm">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <uuid>2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502</uuid>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <name>instance-0000000d</name>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <memory>131072</memory>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <vcpu>1</vcpu>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <metadata>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <nova:name>tempest-TestNetworkBasicOps-server-1352240053</nova:name>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <nova:creationTime>2025-11-29 01:03:52</nova:creationTime>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <nova:flavor name="m1.nano">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:        <nova:memory>128</nova:memory>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:        <nova:disk>1</nova:disk>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:        <nova:swap>0</nova:swap>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:        <nova:vcpus>1</nova:vcpus>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      </nova:flavor>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <nova:owner>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:        <nova:user uuid="1680be98de9e48a19f46eb0bbdfec6fa">tempest-TestNetworkBasicOps-1523736817-project-member</nova:user>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:        <nova:project uuid="0df0de37c7d74836a2135b0d6ff3a067">tempest-TestNetworkBasicOps-1523736817</nova:project>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      </nova:owner>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <nova:root type="image" uuid="017f04d5-006e-46df-a06f-ac852f70dddf"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <nova:ports>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:        <nova:port uuid="b7f6b0cd-1f1b-4bb3-abcd-720615d7920a">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:        </nova:port>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      </nova:ports>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    </nova:instance>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  </metadata>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <sysinfo type="smbios">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <system>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <entry name="manufacturer">RDO</entry>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <entry name="product">OpenStack Compute</entry>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <entry name="serial">2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502</entry>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <entry name="uuid">2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502</entry>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <entry name="family">Virtual Machine</entry>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    </system>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  </sysinfo>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <os>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <boot dev="hd"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <smbios mode="sysinfo"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  </os>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <features>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <acpi/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <apic/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <vmcoreinfo/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  </features>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <clock offset="utc">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <timer name="hpet" present="no"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  </clock>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <cpu mode="host-model" match="exact">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  </cpu>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  <devices>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <disk type="file" device="disk">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <target dev="vda" bus="virtio"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <disk type="file" device="cdrom">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <source file="/var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk.config"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <target dev="sda" bus="sata"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    </disk>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <interface type="ethernet">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <mac address="fa:16:3e:7c:96:13"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <mtu size="1442"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <target dev="tapb7f6b0cd-1f"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    </interface>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <serial type="pty">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <log file="/var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/console.log" append="off"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    </serial>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <video>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <model type="virtio"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    </video>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <input type="tablet" bus="usb"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <rng model="virtio">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <backend model="random">/dev/urandom</backend>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    </rng>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <controller type="usb" index="0"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    <memballoon model="virtio">
Nov 28 20:03:52 np0005539279 nova_compute[187514]:      <stats period="10"/>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:    </memballoon>
Nov 28 20:03:52 np0005539279 nova_compute[187514]:  </devices>
Nov 28 20:03:52 np0005539279 nova_compute[187514]: </domain>
Nov 28 20:03:52 np0005539279 nova_compute[187514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.227 187518 DEBUG nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Preparing to wait for external event network-vif-plugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.228 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.228 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.228 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.229 187518 DEBUG nova.virt.libvirt.vif [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-29T01:03:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1352240053',display_name='tempest-TestNetworkBasicOps-server-1352240053',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1352240053',id=13,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIbqU/+Ly64ZgevQM1MV/N1d9/dd7HZ9Go69ePa6+HhFuPM6aom4NhK8S663PTiK4LYIxQ4b42dznmCYdks0H80tHFS0HWc4yknTnsj0Te+vyv49ABIJi5J78rVHIlIZqw==',key_name='tempest-TestNetworkBasicOps-1947031926',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-nx2cxbr5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-29T01:03:43Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.229 187518 DEBUG nova.network.os_vif_util [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.230 187518 DEBUG nova.network.os_vif_util [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:96:13,bridge_name='br-int',has_traffic_filtering=True,id=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a,network=Network(93624e72-4991-4db9-bfa5-cd34c3d50e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f6b0cd-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.230 187518 DEBUG os_vif [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:96:13,bridge_name='br-int',has_traffic_filtering=True,id=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a,network=Network(93624e72-4991-4db9-bfa5-cd34c3d50e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f6b0cd-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.231 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.231 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.231 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.234 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.235 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7f6b0cd-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.237 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7f6b0cd-1f, col_values=(('external_ids', {'iface-id': 'b7f6b0cd-1f1b-4bb3-abcd-720615d7920a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:96:13', 'vm-uuid': '2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.238 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.239 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:03:52 np0005539279 NetworkManager[55703]: <info>  [1764378232.2397] manager: (tapb7f6b0cd-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.248 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.249 187518 INFO os_vif [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:96:13,bridge_name='br-int',has_traffic_filtering=True,id=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a,network=Network(93624e72-4991-4db9-bfa5-cd34c3d50e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f6b0cd-1f')#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.314 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.314 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.314 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] No VIF found with MAC fa:16:3e:7c:96:13, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.315 187518 INFO nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Using config drive#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.625 187518 INFO nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Creating config drive at /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk.config#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.636 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjuv1u3xo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.782 187518 DEBUG oslo_concurrency.processutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjuv1u3xo" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:03:52 np0005539279 podman[219911]: 2025-11-29 01:03:52.84809149 +0000 UTC m=+0.090422352 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:03:52 np0005539279 podman[219910]: 2025-11-29 01:03:52.875035935 +0000 UTC m=+0.112836967 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 20:03:52 np0005539279 kernel: tapb7f6b0cd-1f: entered promiscuous mode
Nov 28 20:03:52 np0005539279 podman[219912]: 2025-11-29 01:03:52.902726181 +0000 UTC m=+0.127305313 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 20:03:52 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:52Z|00163|binding|INFO|Claiming lport b7f6b0cd-1f1b-4bb3-abcd-720615d7920a for this chassis.
Nov 28 20:03:52 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:52Z|00164|binding|INFO|b7f6b0cd-1f1b-4bb3-abcd-720615d7920a: Claiming fa:16:3e:7c:96:13 10.100.0.4
Nov 28 20:03:52 np0005539279 NetworkManager[55703]: <info>  [1764378232.9345] manager: (tapb7f6b0cd-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.935 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.941 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:52 np0005539279 nova_compute[187514]: 2025-11-29 01:03:52.942 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.950 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:96:13 10.100.0.4'], port_security=['fa:16:3e:7c:96:13 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd50417a4-0ec8-437f-a3e1-7cbc337330f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=136ce0c6-1680-4fc5-9e60-9f8fd75e2728, chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:03:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.950 104584 INFO neutron.agent.ovn.metadata.agent [-] Port b7f6b0cd-1f1b-4bb3-abcd-720615d7920a in datapath 93624e72-4991-4db9-bfa5-cd34c3d50e84 bound to our chassis#033[00m
Nov 28 20:03:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.951 104584 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 93624e72-4991-4db9-bfa5-cd34c3d50e84#033[00m
Nov 28 20:03:52 np0005539279 systemd-udevd[219987]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 20:03:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.966 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2a59a6cd-5ac5-4933-badc-8ed46308619f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.968 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap93624e72-41 in ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 20:03:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.969 214026 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap93624e72-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 20:03:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.969 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[f9b62daf-75a2-4142-9148-1689fae72fb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.970 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[2e38ac02-ae3a-4e04-9591-7bb8a58e5f36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:52 np0005539279 systemd-machined[153752]: New machine qemu-13-instance-0000000d.
Nov 28 20:03:52 np0005539279 NetworkManager[55703]: <info>  [1764378232.9825] device (tapb7f6b0cd-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 20:03:52 np0005539279 NetworkManager[55703]: <info>  [1764378232.9838] device (tapb7f6b0cd-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 20:03:52 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.987 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[b3efb481-42fc-4f4a-add0-c7e07e92afdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:52.999 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[fb46f734-5c85-40ce-803d-850dd4f056b0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:53Z|00165|binding|INFO|Setting lport b7f6b0cd-1f1b-4bb3-abcd-720615d7920a ovn-installed in OVS
Nov 28 20:03:53 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:53Z|00166|binding|INFO|Setting lport b7f6b0cd-1f1b-4bb3-abcd-720615d7920a up in Southbound
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.003 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:53 np0005539279 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.038 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[c63a770d-1c80-480e-8b37-ce5419c529eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.044 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[1fe9c91e-0b2b-4e96-9b66-4fc4d5fefb19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 systemd-udevd[219992]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 20:03:53 np0005539279 NetworkManager[55703]: <info>  [1764378233.0457] manager: (tap93624e72-40): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.084 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[72138c4d-f6f1-481a-8f04-3c7ef5c409da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.089 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1894a1-8e9e-471a-8746-ec6a1010056f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 NetworkManager[55703]: <info>  [1764378233.1173] device (tap93624e72-40): carrier: link connected
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.122 214042 DEBUG oslo.privsep.daemon [-] privsep: reply[69318440-a163-4a41-acda-d4f69d2ccb8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.144 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[89508ed2-0c5a-4db9-bdc8-fb6cfb52864e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap93624e72-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:eb:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416682, 'reachable_time': 34409, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220021, 'error': None, 'target': 'ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.165 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[29f8c623-90d8-4d84-baf7-2aac205fca02]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8e:eb45'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 416682, 'tstamp': 416682}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220022, 'error': None, 'target': 'ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.187 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[aae05506-60ce-4fce-9ccb-0f59f8d7cbee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap93624e72-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:eb:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416682, 'reachable_time': 34409, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220023, 'error': None, 'target': 'ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.233 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[74d949d8-f6e7-4297-bf84-27878379dd9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.323 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[55509df6-9855-4e64-9f82-a9720860ae50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.325 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93624e72-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.325 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.326 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93624e72-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:53 np0005539279 kernel: tap93624e72-40: entered promiscuous mode
Nov 28 20:03:53 np0005539279 NetworkManager[55703]: <info>  [1764378233.3318] manager: (tap93624e72-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.332 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap93624e72-40, col_values=(('external_ids', {'iface-id': 'bafc0439-c42b-4910-aba6-e2213547abb5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:03:53 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:53Z|00167|binding|INFO|Releasing lport bafc0439-c42b-4910-aba6-e2213547abb5 from this chassis (sb_readonly=0)
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.334 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.357 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.358 104584 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/93624e72-4991-4db9-bfa5-cd34c3d50e84.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/93624e72-4991-4db9-bfa5-cd34c3d50e84.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.359 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[efa97eb9-38f4-41cc-b874-9c281f6afe08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.360 104584 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: global
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    log         /dev/log local0 debug
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    log-tag     haproxy-metadata-proxy-93624e72-4991-4db9-bfa5-cd34c3d50e84
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    user        root
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    group       root
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    maxconn     1024
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    pidfile     /var/lib/neutron/external/pids/93624e72-4991-4db9-bfa5-cd34c3d50e84.pid.haproxy
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    daemon
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: defaults
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    log global
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    mode http
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    option httplog
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    option dontlognull
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    option http-server-close
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    option forwardfor
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    retries                 3
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    timeout http-request    30s
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    timeout connect         30s
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    timeout client          32s
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    timeout server          32s
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    timeout http-keep-alive 30s
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: listen listener
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    bind 169.254.169.254:80
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]:    http-request add-header X-OVN-Network-ID 93624e72-4991-4db9-bfa5-cd34c3d50e84
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 20:03:53 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:03:53.361 104584 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'env', 'PROCESS_TAG=haproxy-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/93624e72-4991-4db9-bfa5-cd34c3d50e84.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.633 187518 DEBUG nova.compute.manager [req-0e65aeb9-bfa7-4690-a762-aad99988ea33 req-d8e72006-0041-4433-b787-c04baa57f62f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received event network-vif-plugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.634 187518 DEBUG oslo_concurrency.lockutils [req-0e65aeb9-bfa7-4690-a762-aad99988ea33 req-d8e72006-0041-4433-b787-c04baa57f62f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.635 187518 DEBUG oslo_concurrency.lockutils [req-0e65aeb9-bfa7-4690-a762-aad99988ea33 req-d8e72006-0041-4433-b787-c04baa57f62f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.635 187518 DEBUG oslo_concurrency.lockutils [req-0e65aeb9-bfa7-4690-a762-aad99988ea33 req-d8e72006-0041-4433-b787-c04baa57f62f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.635 187518 DEBUG nova.compute.manager [req-0e65aeb9-bfa7-4690-a762-aad99988ea33 req-d8e72006-0041-4433-b787-c04baa57f62f 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Processing event network-vif-plugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.785 187518 DEBUG nova.network.neutron [req-157daa45-0c8c-4797-9e18-46a3934f637d req-7ea9a11c-ac13-4c1d-9586-7895a1c81fd7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Updated VIF entry in instance network info cache for port b7f6b0cd-1f1b-4bb3-abcd-720615d7920a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.786 187518 DEBUG nova.network.neutron [req-157daa45-0c8c-4797-9e18-46a3934f637d req-7ea9a11c-ac13-4c1d-9586-7895a1c81fd7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Updating instance_info_cache with network_info: [{"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:53 np0005539279 nova_compute[187514]: 2025-11-29 01:03:53.807 187518 DEBUG oslo_concurrency.lockutils [req-157daa45-0c8c-4797-9e18-46a3934f637d req-7ea9a11c-ac13-4c1d-9586-7895a1c81fd7 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:03:53 np0005539279 podman[220058]: 2025-11-29 01:03:53.865775597 +0000 UTC m=+0.077638095 container create 5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:03:53 np0005539279 podman[220058]: 2025-11-29 01:03:53.820546306 +0000 UTC m=+0.032408894 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 20:03:53 np0005539279 systemd[1]: Started libpod-conmon-5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724.scope.
Nov 28 20:03:53 np0005539279 systemd[1]: Started libcrun container.
Nov 28 20:03:53 np0005539279 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77286c4305292011c630b4feb529d829a58ee9a3ca4382794f32cdad7fc327b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 20:03:53 np0005539279 podman[220058]: 2025-11-29 01:03:53.975640807 +0000 UTC m=+0.187503325 container init 5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:03:53 np0005539279 podman[220058]: 2025-11-29 01:03:53.987153729 +0000 UTC m=+0.199016217 container start 5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 20:03:54 np0005539279 neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84[220073]: [NOTICE]   (220077) : New worker (220079) forked
Nov 28 20:03:54 np0005539279 neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84[220073]: [NOTICE]   (220077) : Loading success.
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.315 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378234.3151777, 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.316 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] VM Started (Lifecycle Event)#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.319 187518 DEBUG nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.324 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.330 187518 INFO nova.virt.libvirt.driver [-] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Instance spawned successfully.#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.331 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.353 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.362 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.369 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.369 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.370 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.371 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.372 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.373 187518 DEBUG nova.virt.libvirt.driver [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.388 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.388 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378234.3153648, 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.389 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] VM Paused (Lifecycle Event)#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.420 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.425 187518 DEBUG nova.virt.driver [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] Emitting event <LifecycleEvent: 1764378234.3231716, 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.425 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] VM Resumed (Lifecycle Event)#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.447 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.452 187518 DEBUG nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.457 187518 INFO nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Took 10.81 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.458 187518 DEBUG nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.475 187518 INFO nova.compute.manager [None req-26a8493e-13d3-4435-8bdd-ada0d63b9784 - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.522 187518 INFO nova.compute.manager [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Took 11.48 seconds to build instance.#033[00m
Nov 28 20:03:54 np0005539279 nova_compute[187514]: 2025-11-29 01:03:54.537 187518 DEBUG oslo_concurrency.lockutils [None req-38e57110-50ad-4e9d-b1b1-acebdef5373e 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.563s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:55 np0005539279 nova_compute[187514]: 2025-11-29 01:03:55.745 187518 DEBUG nova.compute.manager [req-284ccb04-3990-4f70-a963-8c6bebcc7eaa req-1b9dd3f5-8bda-4f1d-b0bc-03b1cab57435 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received event network-vif-plugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:55 np0005539279 nova_compute[187514]: 2025-11-29 01:03:55.747 187518 DEBUG oslo_concurrency.lockutils [req-284ccb04-3990-4f70-a963-8c6bebcc7eaa req-1b9dd3f5-8bda-4f1d-b0bc-03b1cab57435 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:03:55 np0005539279 nova_compute[187514]: 2025-11-29 01:03:55.747 187518 DEBUG oslo_concurrency.lockutils [req-284ccb04-3990-4f70-a963-8c6bebcc7eaa req-1b9dd3f5-8bda-4f1d-b0bc-03b1cab57435 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:03:55 np0005539279 nova_compute[187514]: 2025-11-29 01:03:55.748 187518 DEBUG oslo_concurrency.lockutils [req-284ccb04-3990-4f70-a963-8c6bebcc7eaa req-1b9dd3f5-8bda-4f1d-b0bc-03b1cab57435 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:03:55 np0005539279 nova_compute[187514]: 2025-11-29 01:03:55.748 187518 DEBUG nova.compute.manager [req-284ccb04-3990-4f70-a963-8c6bebcc7eaa req-1b9dd3f5-8bda-4f1d-b0bc-03b1cab57435 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] No waiting events found dispatching network-vif-plugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:03:55 np0005539279 nova_compute[187514]: 2025-11-29 01:03:55.749 187518 WARNING nova.compute.manager [req-284ccb04-3990-4f70-a963-8c6bebcc7eaa req-1b9dd3f5-8bda-4f1d-b0bc-03b1cab57435 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received unexpected event network-vif-plugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a for instance with vm_state active and task_state None.#033[00m
Nov 28 20:03:57 np0005539279 NetworkManager[55703]: <info>  [1764378237.0610] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Nov 28 20:03:57 np0005539279 NetworkManager[55703]: <info>  [1764378237.0628] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.063 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:57 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:57Z|00168|binding|INFO|Releasing lport bafc0439-c42b-4910-aba6-e2213547abb5 from this chassis (sb_readonly=0)
Nov 28 20:03:57 np0005539279 ovn_controller[95686]: 2025-11-29T01:03:57Z|00169|binding|INFO|Releasing lport bafc0439-c42b-4910-aba6-e2213547abb5 from this chassis (sb_readonly=0)
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.123 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.131 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.239 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.834 187518 DEBUG nova.compute.manager [req-efbd6dfa-b2ac-42d7-9c41-cdb5de521b46 req-3a443b81-bf3f-4d77-9c7f-30473c004baa 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received event network-changed-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.835 187518 DEBUG nova.compute.manager [req-efbd6dfa-b2ac-42d7-9c41-cdb5de521b46 req-3a443b81-bf3f-4d77-9c7f-30473c004baa 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Refreshing instance network info cache due to event network-changed-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.836 187518 DEBUG oslo_concurrency.lockutils [req-efbd6dfa-b2ac-42d7-9c41-cdb5de521b46 req-3a443b81-bf3f-4d77-9c7f-30473c004baa 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.836 187518 DEBUG oslo_concurrency.lockutils [req-efbd6dfa-b2ac-42d7-9c41-cdb5de521b46 req-3a443b81-bf3f-4d77-9c7f-30473c004baa 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.837 187518 DEBUG nova.network.neutron [req-efbd6dfa-b2ac-42d7-9c41-cdb5de521b46 req-3a443b81-bf3f-4d77-9c7f-30473c004baa 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Refreshing network info cache for port b7f6b0cd-1f1b-4bb3-abcd-720615d7920a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:03:57 np0005539279 nova_compute[187514]: 2025-11-29 01:03:57.994 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:03:58 np0005539279 nova_compute[187514]: 2025-11-29 01:03:58.947 187518 DEBUG nova.network.neutron [req-efbd6dfa-b2ac-42d7-9c41-cdb5de521b46 req-3a443b81-bf3f-4d77-9c7f-30473c004baa 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Updated VIF entry in instance network info cache for port b7f6b0cd-1f1b-4bb3-abcd-720615d7920a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:03:58 np0005539279 nova_compute[187514]: 2025-11-29 01:03:58.949 187518 DEBUG nova.network.neutron [req-efbd6dfa-b2ac-42d7-9c41-cdb5de521b46 req-3a443b81-bf3f-4d77-9c7f-30473c004baa 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Updating instance_info_cache with network_info: [{"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:03:58 np0005539279 nova_compute[187514]: 2025-11-29 01:03:58.971 187518 DEBUG oslo_concurrency.lockutils [req-efbd6dfa-b2ac-42d7-9c41-cdb5de521b46 req-3a443b81-bf3f-4d77-9c7f-30473c004baa 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:04:02 np0005539279 nova_compute[187514]: 2025-11-29 01:04:02.245 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:02 np0005539279 nova_compute[187514]: 2025-11-29 01:04:02.998 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:04 np0005539279 podman[220100]: 2025-11-29 01:04:04.852672707 +0000 UTC m=+0.082205906 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 20:04:04 np0005539279 podman[220099]: 2025-11-29 01:04:04.85727898 +0000 UTC m=+0.083151403 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git)
Nov 28 20:04:06 np0005539279 ovn_controller[95686]: 2025-11-29T01:04:06Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:96:13 10.100.0.4
Nov 28 20:04:06 np0005539279 ovn_controller[95686]: 2025-11-29T01:04:06Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:96:13 10.100.0.4
Nov 28 20:04:07 np0005539279 nova_compute[187514]: 2025-11-29 01:04:07.252 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:08 np0005539279 nova_compute[187514]: 2025-11-29 01:04:08.043 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:08.098 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:04:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:08.099 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:04:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:08.100 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:04:09 np0005539279 podman[220165]: 2025-11-29 01:04:09.852157281 +0000 UTC m=+0.087335844 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 28 20:04:09 np0005539279 podman[220164]: 2025-11-29 01:04:09.915690278 +0000 UTC m=+0.151490559 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:04:12 np0005539279 nova_compute[187514]: 2025-11-29 01:04:12.243 187518 INFO nova.compute.manager [None req-d2b6bcf4-db34-4c9b-b9a3-e1bbff1a57fd 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Get console output#033[00m
Nov 28 20:04:12 np0005539279 nova_compute[187514]: 2025-11-29 01:04:12.250 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 20:04:12 np0005539279 nova_compute[187514]: 2025-11-29 01:04:12.255 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:13 np0005539279 nova_compute[187514]: 2025-11-29 01:04:13.046 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:16 np0005539279 ovn_controller[95686]: 2025-11-29T01:04:16Z|00170|binding|INFO|Releasing lport bafc0439-c42b-4910-aba6-e2213547abb5 from this chassis (sb_readonly=0)
Nov 28 20:04:16 np0005539279 nova_compute[187514]: 2025-11-29 01:04:16.554 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:16 np0005539279 ovn_controller[95686]: 2025-11-29T01:04:16Z|00171|binding|INFO|Releasing lport bafc0439-c42b-4910-aba6-e2213547abb5 from this chassis (sb_readonly=0)
Nov 28 20:04:16 np0005539279 nova_compute[187514]: 2025-11-29 01:04:16.566 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:17 np0005539279 nova_compute[187514]: 2025-11-29 01:04:17.258 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:17 np0005539279 nova_compute[187514]: 2025-11-29 01:04:17.814 187518 INFO nova.compute.manager [None req-cc8e6d3e-cb94-4827-b61f-03c7d95be296 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Get console output#033[00m
Nov 28 20:04:17 np0005539279 nova_compute[187514]: 2025-11-29 01:04:17.820 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 20:04:18 np0005539279 nova_compute[187514]: 2025-11-29 01:04:18.049 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:19 np0005539279 nova_compute[187514]: 2025-11-29 01:04:19.493 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:19 np0005539279 NetworkManager[55703]: <info>  [1764378259.4950] manager: (patch-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Nov 28 20:04:19 np0005539279 NetworkManager[55703]: <info>  [1764378259.4971] manager: (patch-br-int-to-provnet-878cd655-e093-45c4-8d3c-a47a2c76b518): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 28 20:04:19 np0005539279 nova_compute[187514]: 2025-11-29 01:04:19.576 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:19 np0005539279 ovn_controller[95686]: 2025-11-29T01:04:19Z|00172|binding|INFO|Releasing lport bafc0439-c42b-4910-aba6-e2213547abb5 from this chassis (sb_readonly=0)
Nov 28 20:04:19 np0005539279 nova_compute[187514]: 2025-11-29 01:04:19.583 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:19 np0005539279 nova_compute[187514]: 2025-11-29 01:04:19.878 187518 INFO nova.compute.manager [None req-d27a8c82-d1d7-450e-b07a-62a97d4d15fb 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Get console output#033[00m
Nov 28 20:04:19 np0005539279 nova_compute[187514]: 2025-11-29 01:04:19.885 213861 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 20:04:20 np0005539279 nova_compute[187514]: 2025-11-29 01:04:20.997 187518 DEBUG nova.compute.manager [req-70c5b0c3-13c3-4586-9531-704786718ae4 req-afc8038b-7cfd-4838-9381-8f53a8214a37 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received event network-changed-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:04:20 np0005539279 nova_compute[187514]: 2025-11-29 01:04:20.998 187518 DEBUG nova.compute.manager [req-70c5b0c3-13c3-4586-9531-704786718ae4 req-afc8038b-7cfd-4838-9381-8f53a8214a37 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Refreshing instance network info cache due to event network-changed-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 20:04:20 np0005539279 nova_compute[187514]: 2025-11-29 01:04:20.999 187518 DEBUG oslo_concurrency.lockutils [req-70c5b0c3-13c3-4586-9531-704786718ae4 req-afc8038b-7cfd-4838-9381-8f53a8214a37 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:20.999 187518 DEBUG oslo_concurrency.lockutils [req-70c5b0c3-13c3-4586-9531-704786718ae4 req-afc8038b-7cfd-4838-9381-8f53a8214a37 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquired lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.000 187518 DEBUG nova.network.neutron [req-70c5b0c3-13c3-4586-9531-704786718ae4 req-afc8038b-7cfd-4838-9381-8f53a8214a37 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Refreshing network info cache for port b7f6b0cd-1f1b-4bb3-abcd-720615d7920a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.066 187518 DEBUG oslo_concurrency.lockutils [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.066 187518 DEBUG oslo_concurrency.lockutils [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.067 187518 DEBUG oslo_concurrency.lockutils [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.067 187518 DEBUG oslo_concurrency.lockutils [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.067 187518 DEBUG oslo_concurrency.lockutils [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.068 187518 INFO nova.compute.manager [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Terminating instance#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.069 187518 DEBUG nova.compute.manager [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 20:04:21 np0005539279 kernel: tapb7f6b0cd-1f (unregistering): left promiscuous mode
Nov 28 20:04:21 np0005539279 NetworkManager[55703]: <info>  [1764378261.0905] device (tapb7f6b0cd-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 20:04:21 np0005539279 ovn_controller[95686]: 2025-11-29T01:04:21Z|00173|binding|INFO|Releasing lport b7f6b0cd-1f1b-4bb3-abcd-720615d7920a from this chassis (sb_readonly=0)
Nov 28 20:04:21 np0005539279 ovn_controller[95686]: 2025-11-29T01:04:21Z|00174|binding|INFO|Setting lport b7f6b0cd-1f1b-4bb3-abcd-720615d7920a down in Southbound
Nov 28 20:04:21 np0005539279 ovn_controller[95686]: 2025-11-29T01:04:21Z|00175|binding|INFO|Removing iface tapb7f6b0cd-1f ovn-installed in OVS
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.107 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.115 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:96:13 10.100.0.4'], port_security=['fa:16:3e:7c:96:13 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd50417a4-0ec8-437f-a3e1-7cbc337330f3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=136ce0c6-1680-4fc5-9e60-9f8fd75e2728, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>], logical_port=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f6c87ca86a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.117 104584 INFO neutron.agent.ovn.metadata.agent [-] Port b7f6b0cd-1f1b-4bb3-abcd-720615d7920a in datapath 93624e72-4991-4db9-bfa5-cd34c3d50e84 unbound from our chassis#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.118 104584 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 93624e72-4991-4db9-bfa5-cd34c3d50e84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.119 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[1478fe67-9a60-40ca-9a36-d17decb40d64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.120 104584 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84 namespace which is not needed anymore#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.138 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:21 np0005539279 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 28 20:04:21 np0005539279 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 13.886s CPU time.
Nov 28 20:04:21 np0005539279 systemd-machined[153752]: Machine qemu-13-instance-0000000d terminated.
Nov 28 20:04:21 np0005539279 neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84[220073]: [NOTICE]   (220077) : haproxy version is 2.8.14-c23fe91
Nov 28 20:04:21 np0005539279 neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84[220073]: [NOTICE]   (220077) : path to executable is /usr/sbin/haproxy
Nov 28 20:04:21 np0005539279 neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84[220073]: [WARNING]  (220077) : Exiting Master process...
Nov 28 20:04:21 np0005539279 neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84[220073]: [ALERT]    (220077) : Current worker (220079) exited with code 143 (Terminated)
Nov 28 20:04:21 np0005539279 neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84[220073]: [WARNING]  (220077) : All workers exited. Exiting... (0)
Nov 28 20:04:21 np0005539279 systemd[1]: libpod-5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724.scope: Deactivated successfully.
Nov 28 20:04:21 np0005539279 podman[220235]: 2025-11-29 01:04:21.277794533 +0000 UTC m=+0.052439079 container died 5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.298 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.307 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:21 np0005539279 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724-userdata-shm.mount: Deactivated successfully.
Nov 28 20:04:21 np0005539279 systemd[1]: var-lib-containers-storage-overlay-77286c4305292011c630b4feb529d829a58ee9a3ca4382794f32cdad7fc327b3-merged.mount: Deactivated successfully.
Nov 28 20:04:21 np0005539279 podman[220235]: 2025-11-29 01:04:21.343060981 +0000 UTC m=+0.117705527 container cleanup 5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.351 187518 INFO nova.virt.libvirt.driver [-] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Instance destroyed successfully.#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.351 187518 DEBUG nova.objects.instance [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lazy-loading 'resources' on Instance uuid 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 20:04:21 np0005539279 systemd[1]: libpod-conmon-5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724.scope: Deactivated successfully.
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.364 187518 DEBUG nova.virt.libvirt.vif [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-29T01:03:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1352240053',display_name='tempest-TestNetworkBasicOps-server-1352240053',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1352240053',id=13,image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIbqU/+Ly64ZgevQM1MV/N1d9/dd7HZ9Go69ePa6+HhFuPM6aom4NhK8S663PTiK4LYIxQ4b42dznmCYdks0H80tHFS0HWc4yknTnsj0Te+vyv49ABIJi5J78rVHIlIZqw==',key_name='tempest-TestNetworkBasicOps-1947031926',keypairs=<?>,launch_index=0,launched_at=2025-11-29T01:03:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0df0de37c7d74836a2135b0d6ff3a067',ramdisk_id='',reservation_id='r-nx2cxbr5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='017f04d5-006e-46df-a06f-ac852f70dddf',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1523736817',owner_user_name='tempest-TestNetworkBasicOps-1523736817-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-29T01:03:54Z,user_data=None,user_id='1680be98de9e48a19f46eb0bbdfec6fa',uuid=2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.365 187518 DEBUG nova.network.os_vif_util [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converting VIF {"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.366 187518 DEBUG nova.network.os_vif_util [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:96:13,bridge_name='br-int',has_traffic_filtering=True,id=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a,network=Network(93624e72-4991-4db9-bfa5-cd34c3d50e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f6b0cd-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.366 187518 DEBUG os_vif [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:96:13,bridge_name='br-int',has_traffic_filtering=True,id=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a,network=Network(93624e72-4991-4db9-bfa5-cd34c3d50e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f6b0cd-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.368 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.368 187518 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7f6b0cd-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.370 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.372 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.378 187518 INFO os_vif [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:96:13,bridge_name='br-int',has_traffic_filtering=True,id=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a,network=Network(93624e72-4991-4db9-bfa5-cd34c3d50e84),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7f6b0cd-1f')#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.379 187518 INFO nova.virt.libvirt.driver [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Deleting instance files /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502_del#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.380 187518 INFO nova.virt.libvirt.driver [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Deletion of /var/lib/nova/instances/2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502_del complete#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.431 187518 INFO nova.compute.manager [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.432 187518 DEBUG oslo.service.loopingcall [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.434 187518 DEBUG nova.compute.manager [-] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.434 187518 DEBUG nova.network.neutron [-] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 20:04:21 np0005539279 podman[220283]: 2025-11-29 01:04:21.467773049 +0000 UTC m=+0.078465319 container remove 5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.476 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[6b950bef-8693-433f-b446-7c5d0b80a170]: (4, ('Sat Nov 29 01:04:21 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84 (5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724)\n5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724\nSat Nov 29 01:04:21 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84 (5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724)\n5d50f36b6ee4c27576b45cf56845d8ac4b54e06d3f2346b393b77032727f2724\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.479 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[78875d7f-989a-419b-9c5e-b050b5ea81f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.481 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93624e72-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.483 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:21 np0005539279 kernel: tap93624e72-40: left promiscuous mode
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.508 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.513 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[03cfaef2-e3ce-4f5d-8547-2d305f7f75ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.531 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[c55dc67d-127c-4d13-934c-047ba7908b07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.533 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[24a9d7f4-1c49-4b78-b650-2acaf302a764]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.561 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[dbf21e57-d31b-4a38-bb76-aed29cbab23a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 416673, 'reachable_time': 29427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220298, 'error': None, 'target': 'ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:04:21 np0005539279 systemd[1]: run-netns-ovnmeta\x2d93624e72\x2d4991\x2d4db9\x2dbfa5\x2dcd34c3d50e84.mount: Deactivated successfully.
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.566 104698 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 20:04:21 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:21.566 104698 DEBUG oslo.privsep.daemon [-] privsep: reply[0939fc39-7712-42ef-ac6c-58f41e98d656]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.670 187518 DEBUG nova.compute.manager [req-0ec786b7-88a1-46f1-9c3a-b1083838711a req-a4269c94-41e7-48ee-9baa-f6dea3980d6c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received event network-vif-unplugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.671 187518 DEBUG oslo_concurrency.lockutils [req-0ec786b7-88a1-46f1-9c3a-b1083838711a req-a4269c94-41e7-48ee-9baa-f6dea3980d6c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.671 187518 DEBUG oslo_concurrency.lockutils [req-0ec786b7-88a1-46f1-9c3a-b1083838711a req-a4269c94-41e7-48ee-9baa-f6dea3980d6c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.672 187518 DEBUG oslo_concurrency.lockutils [req-0ec786b7-88a1-46f1-9c3a-b1083838711a req-a4269c94-41e7-48ee-9baa-f6dea3980d6c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.672 187518 DEBUG nova.compute.manager [req-0ec786b7-88a1-46f1-9c3a-b1083838711a req-a4269c94-41e7-48ee-9baa-f6dea3980d6c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] No waiting events found dispatching network-vif-unplugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:04:21 np0005539279 nova_compute[187514]: 2025-11-29 01:04:21.673 187518 DEBUG nova.compute.manager [req-0ec786b7-88a1-46f1-9c3a-b1083838711a req-a4269c94-41e7-48ee-9baa-f6dea3980d6c 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received event network-vif-unplugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 20:04:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:22.121 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.126 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:22.125 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 20:04:22 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:22.127 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.671 187518 DEBUG nova.network.neutron [-] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.675 187518 DEBUG nova.network.neutron [req-70c5b0c3-13c3-4586-9531-704786718ae4 req-afc8038b-7cfd-4838-9381-8f53a8214a37 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Updated VIF entry in instance network info cache for port b7f6b0cd-1f1b-4bb3-abcd-720615d7920a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.676 187518 DEBUG nova.network.neutron [req-70c5b0c3-13c3-4586-9531-704786718ae4 req-afc8038b-7cfd-4838-9381-8f53a8214a37 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Updating instance_info_cache with network_info: [{"id": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "address": "fa:16:3e:7c:96:13", "network": {"id": "93624e72-4991-4db9-bfa5-cd34c3d50e84", "bridge": "br-int", "label": "tempest-network-smoke--1175002114", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0df0de37c7d74836a2135b0d6ff3a067", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7f6b0cd-1f", "ovs_interfaceid": "b7f6b0cd-1f1b-4bb3-abcd-720615d7920a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.706 187518 INFO nova.compute.manager [-] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Took 1.27 seconds to deallocate network for instance.#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.718 187518 DEBUG oslo_concurrency.lockutils [req-70c5b0c3-13c3-4586-9531-704786718ae4 req-afc8038b-7cfd-4838-9381-8f53a8214a37 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Releasing lock "refresh_cache-2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.774 187518 DEBUG oslo_concurrency.lockutils [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.775 187518 DEBUG oslo_concurrency.lockutils [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.856 187518 DEBUG nova.compute.provider_tree [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.892 187518 DEBUG nova.scheduler.client.report [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.929 187518 DEBUG oslo_concurrency.lockutils [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:04:22 np0005539279 nova_compute[187514]: 2025-11-29 01:04:22.990 187518 INFO nova.scheduler.client.report [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Deleted allocations for instance 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.052 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.072 187518 DEBUG oslo_concurrency.lockutils [None req-73998f31-46aa-4379-9caf-9d1e3da47c37 1680be98de9e48a19f46eb0bbdfec6fa 0df0de37c7d74836a2135b0d6ff3a067 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.786 187518 DEBUG nova.compute.manager [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received event network-vif-plugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.787 187518 DEBUG oslo_concurrency.lockutils [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Acquiring lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.788 187518 DEBUG oslo_concurrency.lockutils [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.788 187518 DEBUG oslo_concurrency.lockutils [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] Lock "2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.789 187518 DEBUG nova.compute.manager [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] No waiting events found dispatching network-vif-plugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.789 187518 WARNING nova.compute.manager [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received unexpected event network-vif-plugged-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a for instance with vm_state deleted and task_state None.#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.790 187518 DEBUG nova.compute.manager [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Received event network-vif-deleted-b7f6b0cd-1f1b-4bb3-abcd-720615d7920a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.790 187518 INFO nova.compute.manager [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Neutron deleted interface b7f6b0cd-1f1b-4bb3-abcd-720615d7920a; detaching it from the instance and deleting it from the info cache#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.791 187518 DEBUG nova.network.neutron [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Nov 28 20:04:23 np0005539279 nova_compute[187514]: 2025-11-29 01:04:23.795 187518 DEBUG nova.compute.manager [req-8d58016f-27c5-4830-8eec-dafbbe070f7f req-64e9602d-4c16-48e5-b00b-51a450aed698 3fe22becc02b42088afdcc96ac08ed99 6529723595344362a859e1e0fa809405 - - default default] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Detach interface failed, port_id=b7f6b0cd-1f1b-4bb3-abcd-720615d7920a, reason: Instance 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 28 20:04:23 np0005539279 podman[220300]: 2025-11-29 01:04:23.880879339 +0000 UTC m=+0.109544783 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 20:04:23 np0005539279 podman[220299]: 2025-11-29 01:04:23.881129506 +0000 UTC m=+0.119365685 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 20:04:23 np0005539279 podman[220301]: 2025-11-29 01:04:23.885993406 +0000 UTC m=+0.113024623 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.633 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.634 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.634 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.635 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.871 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.874 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5736MB free_disk=73.33920288085938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.875 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.875 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.965 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.966 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:04:25 np0005539279 nova_compute[187514]: 2025-11-29 01:04:25.991 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing inventories for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 20:04:26 np0005539279 nova_compute[187514]: 2025-11-29 01:04:26.012 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating ProviderTree inventory for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 20:04:26 np0005539279 nova_compute[187514]: 2025-11-29 01:04:26.012 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 20:04:26 np0005539279 nova_compute[187514]: 2025-11-29 01:04:26.030 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing aggregate associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 20:04:26 np0005539279 nova_compute[187514]: 2025-11-29 01:04:26.063 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing trait associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, traits: COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,HW_CPU_X86_SSE,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 20:04:26 np0005539279 nova_compute[187514]: 2025-11-29 01:04:26.089 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:04:26 np0005539279 nova_compute[187514]: 2025-11-29 01:04:26.106 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:04:26 np0005539279 nova_compute[187514]: 2025-11-29 01:04:26.132 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:04:26 np0005539279 nova_compute[187514]: 2025-11-29 01:04:26.133 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.258s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:04:26 np0005539279 nova_compute[187514]: 2025-11-29 01:04:26.371 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:28 np0005539279 nova_compute[187514]: 2025-11-29 01:04:28.085 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:29 np0005539279 nova_compute[187514]: 2025-11-29 01:04:29.132 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:29 np0005539279 nova_compute[187514]: 2025-11-29 01:04:29.133 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:04:29 np0005539279 nova_compute[187514]: 2025-11-29 01:04:29.133 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:04:29 np0005539279 nova_compute[187514]: 2025-11-29 01:04:29.156 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:04:29 np0005539279 nova_compute[187514]: 2025-11-29 01:04:29.156 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:30 np0005539279 nova_compute[187514]: 2025-11-29 01:04:30.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:30 np0005539279 nova_compute[187514]: 2025-11-29 01:04:30.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:04:31 np0005539279 nova_compute[187514]: 2025-11-29 01:04:31.374 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:32 np0005539279 nova_compute[187514]: 2025-11-29 01:04:32.281 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:32 np0005539279 nova_compute[187514]: 2025-11-29 01:04:32.378 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:32.460 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:eb:45'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=136ce0c6-1680-4fc5-9e60-9f8fd75e2728, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bafc0439-c42b-4910-aba6-e2213547abb5) old=Port_Binding(mac=['fa:16:3e:8e:eb:45 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93624e72-4991-4db9-bfa5-cd34c3d50e84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0df0de37c7d74836a2135b0d6ff3a067', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:04:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:32.462 104584 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bafc0439-c42b-4910-aba6-e2213547abb5 in datapath 93624e72-4991-4db9-bfa5-cd34c3d50e84 updated#033[00m
Nov 28 20:04:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:32.464 104584 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 93624e72-4991-4db9-bfa5-cd34c3d50e84 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Nov 28 20:04:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:04:32.465 214026 DEBUG oslo.privsep.daemon [-] privsep: reply[966c4941-ecb2-4d69-94b3-2e6d5f399f84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 20:04:33 np0005539279 nova_compute[187514]: 2025-11-29 01:04:33.088 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:33 np0005539279 nova_compute[187514]: 2025-11-29 01:04:33.605 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:34 np0005539279 nova_compute[187514]: 2025-11-29 01:04:34.631 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:35 np0005539279 nova_compute[187514]: 2025-11-29 01:04:35.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:35 np0005539279 nova_compute[187514]: 2025-11-29 01:04:35.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:35 np0005539279 podman[220364]: 2025-11-29 01:04:35.85951825 +0000 UTC m=+0.091260157 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 20:04:35 np0005539279 podman[220363]: 2025-11-29 01:04:35.869390454 +0000 UTC m=+0.103981713 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=edpm, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 20:04:36 np0005539279 nova_compute[187514]: 2025-11-29 01:04:36.347 187518 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764378261.345888, 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 20:04:36 np0005539279 nova_compute[187514]: 2025-11-29 01:04:36.348 187518 INFO nova.compute.manager [-] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] VM Stopped (Lifecycle Event)#033[00m
Nov 28 20:04:36 np0005539279 nova_compute[187514]: 2025-11-29 01:04:36.390 187518 DEBUG nova.compute.manager [None req-c7ccc530-5a02-4bed-bd27-91c845a9a48d - - - - - -] [instance: 2fd80c4c-8c3d-40c5-ae93-7bbbccc3b502] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 20:04:36 np0005539279 nova_compute[187514]: 2025-11-29 01:04:36.417 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:38 np0005539279 nova_compute[187514]: 2025-11-29 01:04:38.090 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:39 np0005539279 nova_compute[187514]: 2025-11-29 01:04:39.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:04:40 np0005539279 podman[220409]: 2025-11-29 01:04:40.850530881 +0000 UTC m=+0.081403443 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 20:04:40 np0005539279 podman[220408]: 2025-11-29 01:04:40.906888472 +0000 UTC m=+0.142449039 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 20:04:41 np0005539279 nova_compute[187514]: 2025-11-29 01:04:41.420 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:43 np0005539279 nova_compute[187514]: 2025-11-29 01:04:43.144 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:46 np0005539279 nova_compute[187514]: 2025-11-29 01:04:46.423 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:48 np0005539279 nova_compute[187514]: 2025-11-29 01:04:48.193 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:51 np0005539279 nova_compute[187514]: 2025-11-29 01:04:51.431 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:53 np0005539279 nova_compute[187514]: 2025-11-29 01:04:53.194 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:54 np0005539279 podman[220461]: 2025-11-29 01:04:54.879318349 +0000 UTC m=+0.099391560 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 20:04:54 np0005539279 podman[220460]: 2025-11-29 01:04:54.888504724 +0000 UTC m=+0.117731558 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 20:04:54 np0005539279 podman[220462]: 2025-11-29 01:04:54.891121879 +0000 UTC m=+0.109355547 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 20:04:56 np0005539279 nova_compute[187514]: 2025-11-29 01:04:56.440 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:04:58 np0005539279 nova_compute[187514]: 2025-11-29 01:04:58.237 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:01 np0005539279 nova_compute[187514]: 2025-11-29 01:05:01.445 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:02 np0005539279 ovn_controller[95686]: 2025-11-29T01:05:02Z|00176|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Nov 28 20:05:03 np0005539279 nova_compute[187514]: 2025-11-29 01:05:03.239 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:06 np0005539279 nova_compute[187514]: 2025-11-29 01:05:06.452 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:06 np0005539279 podman[220527]: 2025-11-29 01:05:06.866069183 +0000 UTC m=+0.093199912 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 20:05:06 np0005539279 podman[220526]: 2025-11-29 01:05:06.893629356 +0000 UTC m=+0.129105205 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 20:05:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:05:08.099 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:05:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:05:08.099 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:05:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:05:08.099 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:05:08 np0005539279 nova_compute[187514]: 2025-11-29 01:05:08.241 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:11 np0005539279 nova_compute[187514]: 2025-11-29 01:05:11.455 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:11 np0005539279 podman[220576]: 2025-11-29 01:05:11.83715975 +0000 UTC m=+0.079091726 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:05:11 np0005539279 podman[220575]: 2025-11-29 01:05:11.895830198 +0000 UTC m=+0.141968705 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 20:05:13 np0005539279 nova_compute[187514]: 2025-11-29 01:05:13.244 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:16 np0005539279 nova_compute[187514]: 2025-11-29 01:05:16.457 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:18 np0005539279 nova_compute[187514]: 2025-11-29 01:05:18.247 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:21 np0005539279 nova_compute[187514]: 2025-11-29 01:05:21.460 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:23 np0005539279 nova_compute[187514]: 2025-11-29 01:05:23.249 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:23 np0005539279 systemd-logind[811]: New session 27 of user zuul.
Nov 28 20:05:23 np0005539279 systemd[1]: Started Session 27 of User zuul.
Nov 28 20:05:25 np0005539279 podman[220663]: 2025-11-29 01:05:25.179908834 +0000 UTC m=+0.076880453 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 20:05:25 np0005539279 podman[220664]: 2025-11-29 01:05:25.207158708 +0000 UTC m=+0.097254619 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 20:05:25 np0005539279 podman[220662]: 2025-11-29 01:05:25.238406066 +0000 UTC m=+0.129257079 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 28 20:05:25 np0005539279 nova_compute[187514]: 2025-11-29 01:05:25.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:05:26 np0005539279 nova_compute[187514]: 2025-11-29 01:05:26.463 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:26 np0005539279 nova_compute[187514]: 2025-11-29 01:05:26.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.248 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.249 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.249 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.250 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.485 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.487 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5645MB free_disk=73.33893966674805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.487 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.488 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.571 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.571 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.597 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.613 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.616 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:05:27 np0005539279 nova_compute[187514]: 2025-11-29 01:05:27.616 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:05:28 np0005539279 nova_compute[187514]: 2025-11-29 01:05:28.280 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:29 np0005539279 ovs-vsctl[220859]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 28 20:05:29 np0005539279 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 220652 (sos)
Nov 28 20:05:29 np0005539279 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 28 20:05:29 np0005539279 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 28 20:05:30 np0005539279 virtqemud[187089]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 28 20:05:30 np0005539279 virtqemud[187089]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 28 20:05:30 np0005539279 virtqemud[187089]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 28 20:05:30 np0005539279 nova_compute[187514]: 2025-11-29 01:05:30.617 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:05:30 np0005539279 nova_compute[187514]: 2025-11-29 01:05:30.619 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:05:30 np0005539279 nova_compute[187514]: 2025-11-29 01:05:30.619 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:05:30 np0005539279 nova_compute[187514]: 2025-11-29 01:05:30.647 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:05:30 np0005539279 nova_compute[187514]: 2025-11-29 01:05:30.648 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:05:31 np0005539279 nova_compute[187514]: 2025-11-29 01:05:31.464 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:31 np0005539279 nova_compute[187514]: 2025-11-29 01:05:31.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:05:31 np0005539279 nova_compute[187514]: 2025-11-29 01:05:31.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:05:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:05:33 np0005539279 nova_compute[187514]: 2025-11-29 01:05:33.280 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:34 np0005539279 systemd[1]: Starting Hostname Service...
Nov 28 20:05:34 np0005539279 systemd[1]: Started Hostname Service.
Nov 28 20:05:35 np0005539279 nova_compute[187514]: 2025-11-29 01:05:35.604 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:05:36 np0005539279 nova_compute[187514]: 2025-11-29 01:05:36.467 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:36 np0005539279 nova_compute[187514]: 2025-11-29 01:05:36.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:05:37 np0005539279 nova_compute[187514]: 2025-11-29 01:05:37.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:05:37 np0005539279 podman[221751]: 2025-11-29 01:05:37.856375719 +0000 UTC m=+0.087708987 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 20:05:37 np0005539279 podman[221748]: 2025-11-29 01:05:37.872615226 +0000 UTC m=+0.111488689 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Nov 28 20:05:38 np0005539279 nova_compute[187514]: 2025-11-29 01:05:38.282 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:40 np0005539279 nova_compute[187514]: 2025-11-29 01:05:40.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:05:41 np0005539279 ovs-appctl[222456]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 28 20:05:41 np0005539279 ovs-appctl[222462]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 28 20:05:41 np0005539279 ovs-appctl[222469]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 28 20:05:41 np0005539279 nova_compute[187514]: 2025-11-29 01:05:41.469 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:42 np0005539279 podman[222930]: 2025-11-29 01:05:42.823946332 +0000 UTC m=+0.067700692 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 20:05:42 np0005539279 podman[222926]: 2025-11-29 01:05:42.891525861 +0000 UTC m=+0.129665321 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:05:43 np0005539279 nova_compute[187514]: 2025-11-29 01:05:43.332 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:46 np0005539279 nova_compute[187514]: 2025-11-29 01:05:46.472 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:48 np0005539279 nova_compute[187514]: 2025-11-29 01:05:48.367 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:50 np0005539279 virtqemud[187089]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 28 20:05:51 np0005539279 nova_compute[187514]: 2025-11-29 01:05:51.479 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:52 np0005539279 systemd[1]: Starting Time & Date Service...
Nov 28 20:05:53 np0005539279 systemd[1]: Started Time & Date Service.
Nov 28 20:05:53 np0005539279 nova_compute[187514]: 2025-11-29 01:05:53.377 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:55 np0005539279 podman[224059]: 2025-11-29 01:05:55.875470026 +0000 UTC m=+0.101116418 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:05:55 np0005539279 podman[224058]: 2025-11-29 01:05:55.887115446 +0000 UTC m=+0.116389692 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 20:05:55 np0005539279 podman[224060]: 2025-11-29 01:05:55.895546922 +0000 UTC m=+0.116118266 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 20:05:56 np0005539279 nova_compute[187514]: 2025-11-29 01:05:56.483 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:05:58 np0005539279 nova_compute[187514]: 2025-11-29 01:05:58.380 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:01 np0005539279 nova_compute[187514]: 2025-11-29 01:06:01.488 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:03 np0005539279 nova_compute[187514]: 2025-11-29 01:06:03.385 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:06 np0005539279 nova_compute[187514]: 2025-11-29 01:06:06.496 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:06:08.099 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:06:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:06:08.100 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:06:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:06:08.100 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:06:08 np0005539279 nova_compute[187514]: 2025-11-29 01:06:08.385 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:08 np0005539279 podman[224119]: 2025-11-29 01:06:08.84905934 +0000 UTC m=+0.085284640 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 20:06:08 np0005539279 podman[224118]: 2025-11-29 01:06:08.882003375 +0000 UTC m=+0.117597411 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Nov 28 20:06:11 np0005539279 nova_compute[187514]: 2025-11-29 01:06:11.499 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:12 np0005539279 systemd[1]: session-27.scope: Deactivated successfully.
Nov 28 20:06:12 np0005539279 systemd[1]: session-27.scope: Consumed 1min 25.348s CPU time, 490.2M memory peak, read 101.1M from disk, written 15.1M to disk.
Nov 28 20:06:12 np0005539279 systemd-logind[811]: Session 27 logged out. Waiting for processes to exit.
Nov 28 20:06:12 np0005539279 systemd-logind[811]: Removed session 27.
Nov 28 20:06:12 np0005539279 systemd-logind[811]: New session 28 of user zuul.
Nov 28 20:06:12 np0005539279 systemd[1]: Started Session 28 of User zuul.
Nov 28 20:06:12 np0005539279 systemd[1]: session-28.scope: Deactivated successfully.
Nov 28 20:06:12 np0005539279 systemd-logind[811]: Session 28 logged out. Waiting for processes to exit.
Nov 28 20:06:12 np0005539279 systemd-logind[811]: Removed session 28.
Nov 28 20:06:13 np0005539279 nova_compute[187514]: 2025-11-29 01:06:13.390 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:13 np0005539279 systemd-logind[811]: New session 29 of user zuul.
Nov 28 20:06:13 np0005539279 systemd[1]: Started Session 29 of User zuul.
Nov 28 20:06:13 np0005539279 podman[224197]: 2025-11-29 01:06:13.607015519 +0000 UTC m=+0.068899500 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 20:06:13 np0005539279 podman[224195]: 2025-11-29 01:06:13.699412014 +0000 UTC m=+0.155581932 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller)
Nov 28 20:06:13 np0005539279 systemd[1]: session-29.scope: Deactivated successfully.
Nov 28 20:06:13 np0005539279 systemd-logind[811]: Session 29 logged out. Waiting for processes to exit.
Nov 28 20:06:13 np0005539279 systemd-logind[811]: Removed session 29.
Nov 28 20:06:16 np0005539279 nova_compute[187514]: 2025-11-29 01:06:16.501 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:18 np0005539279 nova_compute[187514]: 2025-11-29 01:06:18.437 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:21 np0005539279 nova_compute[187514]: 2025-11-29 01:06:21.504 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:23 np0005539279 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 20:06:23 np0005539279 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 20:06:23 np0005539279 nova_compute[187514]: 2025-11-29 01:06:23.440 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.506 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.653 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.654 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.654 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.654 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:06:26 np0005539279 podman[224277]: 2025-11-29 01:06:26.856917898 +0000 UTC m=+0.087465521 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 28 20:06:26 np0005539279 podman[224276]: 2025-11-29 01:06:26.858056215 +0000 UTC m=+0.084991624 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 20:06:26 np0005539279 podman[224275]: 2025-11-29 01:06:26.903369717 +0000 UTC m=+0.134531294 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.972 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.974 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5712MB free_disk=73.3365592956543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.975 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:06:26 np0005539279 nova_compute[187514]: 2025-11-29 01:06:26.975 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:06:27 np0005539279 nova_compute[187514]: 2025-11-29 01:06:27.066 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:06:27 np0005539279 nova_compute[187514]: 2025-11-29 01:06:27.067 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:06:27 np0005539279 nova_compute[187514]: 2025-11-29 01:06:27.104 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:06:27 np0005539279 nova_compute[187514]: 2025-11-29 01:06:27.134 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:06:27 np0005539279 nova_compute[187514]: 2025-11-29 01:06:27.136 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:06:27 np0005539279 nova_compute[187514]: 2025-11-29 01:06:27.136 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:06:28 np0005539279 nova_compute[187514]: 2025-11-29 01:06:28.136 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:28 np0005539279 nova_compute[187514]: 2025-11-29 01:06:28.443 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:30 np0005539279 nova_compute[187514]: 2025-11-29 01:06:30.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:31 np0005539279 nova_compute[187514]: 2025-11-29 01:06:31.509 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:31 np0005539279 nova_compute[187514]: 2025-11-29 01:06:31.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:31 np0005539279 nova_compute[187514]: 2025-11-29 01:06:31.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:06:32 np0005539279 nova_compute[187514]: 2025-11-29 01:06:32.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:32 np0005539279 nova_compute[187514]: 2025-11-29 01:06:32.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:06:32 np0005539279 nova_compute[187514]: 2025-11-29 01:06:32.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:06:32 np0005539279 nova_compute[187514]: 2025-11-29 01:06:32.625 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:06:33 np0005539279 nova_compute[187514]: 2025-11-29 01:06:33.486 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:36 np0005539279 nova_compute[187514]: 2025-11-29 01:06:36.510 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:36 np0005539279 nova_compute[187514]: 2025-11-29 01:06:36.618 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:36 np0005539279 nova_compute[187514]: 2025-11-29 01:06:36.619 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:37 np0005539279 nova_compute[187514]: 2025-11-29 01:06:37.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:38 np0005539279 nova_compute[187514]: 2025-11-29 01:06:38.538 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:39 np0005539279 nova_compute[187514]: 2025-11-29 01:06:39.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:39 np0005539279 podman[224341]: 2025-11-29 01:06:39.829268414 +0000 UTC m=+0.064055518 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 20:06:39 np0005539279 podman[224340]: 2025-11-29 01:06:39.847361424 +0000 UTC m=+0.079259931 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc.)
Nov 28 20:06:41 np0005539279 nova_compute[187514]: 2025-11-29 01:06:41.513 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:42 np0005539279 nova_compute[187514]: 2025-11-29 01:06:42.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:06:43 np0005539279 nova_compute[187514]: 2025-11-29 01:06:43.541 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:43 np0005539279 podman[224385]: 2025-11-29 01:06:43.83879357 +0000 UTC m=+0.079066976 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 20:06:43 np0005539279 podman[224384]: 2025-11-29 01:06:43.866715658 +0000 UTC m=+0.106948933 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:06:46 np0005539279 nova_compute[187514]: 2025-11-29 01:06:46.515 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:48 np0005539279 nova_compute[187514]: 2025-11-29 01:06:48.544 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:51 np0005539279 nova_compute[187514]: 2025-11-29 01:06:51.517 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:53 np0005539279 nova_compute[187514]: 2025-11-29 01:06:53.545 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:56 np0005539279 nova_compute[187514]: 2025-11-29 01:06:56.563 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:06:57 np0005539279 podman[224429]: 2025-11-29 01:06:57.832435943 +0000 UTC m=+0.075296669 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:06:57 np0005539279 podman[224431]: 2025-11-29 01:06:57.832428102 +0000 UTC m=+0.062901561 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:06:57 np0005539279 podman[224430]: 2025-11-29 01:06:57.840144291 +0000 UTC m=+0.082552057 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 20:06:58 np0005539279 nova_compute[187514]: 2025-11-29 01:06:58.547 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:01 np0005539279 nova_compute[187514]: 2025-11-29 01:07:01.597 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:03 np0005539279 nova_compute[187514]: 2025-11-29 01:07:03.549 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:06 np0005539279 nova_compute[187514]: 2025-11-29 01:07:06.634 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:07:08.100 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:07:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:07:08.101 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:07:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:07:08.101 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:07:08 np0005539279 nova_compute[187514]: 2025-11-29 01:07:08.553 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:11 np0005539279 podman[224490]: 2025-11-29 01:07:11.31798775 +0000 UTC m=+0.081671986 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, 
vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 20:07:11 np0005539279 podman[224491]: 2025-11-29 01:07:11.340014802 +0000 UTC m=+0.088018664 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 20:07:11 np0005539279 nova_compute[187514]: 2025-11-29 01:07:11.636 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:13 np0005539279 nova_compute[187514]: 2025-11-29 01:07:13.556 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:14 np0005539279 podman[224535]: 2025-11-29 01:07:14.853440303 +0000 UTC m=+0.085803212 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:07:14 np0005539279 podman[224534]: 2025-11-29 01:07:14.90239938 +0000 UTC m=+0.141864874 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Nov 28 20:07:16 np0005539279 nova_compute[187514]: 2025-11-29 01:07:16.673 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:18 np0005539279 nova_compute[187514]: 2025-11-29 01:07:18.558 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:21 np0005539279 nova_compute[187514]: 2025-11-29 01:07:21.676 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:23 np0005539279 nova_compute[187514]: 2025-11-29 01:07:23.560 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:26 np0005539279 nova_compute[187514]: 2025-11-29 01:07:26.717 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.562 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.647 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.647 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.648 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.648 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.800 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.801 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5741MB free_disk=73.3365592956543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.801 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.801 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:07:28 np0005539279 podman[224583]: 2025-11-29 01:07:28.824201765 +0000 UTC m=+0.064041777 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:07:28 np0005539279 podman[224584]: 2025-11-29 01:07:28.824187835 +0000 UTC m=+0.064042517 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, container_name=multipathd)
Nov 28 20:07:28 np0005539279 podman[224582]: 2025-11-29 01:07:28.857533939 +0000 UTC m=+0.100993085 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.865 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.866 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.890 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.903 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.904 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:07:28 np0005539279 nova_compute[187514]: 2025-11-29 01:07:28.904 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:07:29 np0005539279 nova_compute[187514]: 2025-11-29 01:07:29.905 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:07:30 np0005539279 nova_compute[187514]: 2025-11-29 01:07:30.614 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:07:31 np0005539279 nova_compute[187514]: 2025-11-29 01:07:31.720 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:07:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:07:32 np0005539279 nova_compute[187514]: 2025-11-29 01:07:32.612 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:07:32 np0005539279 nova_compute[187514]: 2025-11-29 01:07:32.613 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:07:33 np0005539279 nova_compute[187514]: 2025-11-29 01:07:33.564 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:34 np0005539279 nova_compute[187514]: 2025-11-29 01:07:34.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:07:34 np0005539279 nova_compute[187514]: 2025-11-29 01:07:34.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:07:34 np0005539279 nova_compute[187514]: 2025-11-29 01:07:34.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:07:34 np0005539279 nova_compute[187514]: 2025-11-29 01:07:34.711 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:07:36 np0005539279 nova_compute[187514]: 2025-11-29 01:07:36.757 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:37 np0005539279 nova_compute[187514]: 2025-11-29 01:07:37.707 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:07:38 np0005539279 nova_compute[187514]: 2025-11-29 01:07:38.573 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:38 np0005539279 nova_compute[187514]: 2025-11-29 01:07:38.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:07:41 np0005539279 nova_compute[187514]: 2025-11-29 01:07:41.611 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:07:41 np0005539279 nova_compute[187514]: 2025-11-29 01:07:41.795 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:41 np0005539279 podman[224649]: 2025-11-29 01:07:41.844877602 +0000 UTC m=+0.078334059 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 20:07:41 np0005539279 podman[224648]: 2025-11-29 01:07:41.856967793 +0000 UTC m=+0.093127412 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, managed_by=edpm_ansible)
Nov 28 20:07:43 np0005539279 nova_compute[187514]: 2025-11-29 01:07:43.572 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:44 np0005539279 nova_compute[187514]: 2025-11-29 01:07:44.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:07:45 np0005539279 podman[224699]: 2025-11-29 01:07:45.694925132 +0000 UTC m=+0.089822619 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 20:07:45 np0005539279 podman[224698]: 2025-11-29 01:07:45.746947479 +0000 UTC m=+0.146992573 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 20:07:46 np0005539279 nova_compute[187514]: 2025-11-29 01:07:46.798 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:48 np0005539279 nova_compute[187514]: 2025-11-29 01:07:48.573 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:51 np0005539279 nova_compute[187514]: 2025-11-29 01:07:51.803 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:53 np0005539279 nova_compute[187514]: 2025-11-29 01:07:53.575 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:56 np0005539279 nova_compute[187514]: 2025-11-29 01:07:56.805 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:58 np0005539279 nova_compute[187514]: 2025-11-29 01:07:58.612 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:07:59 np0005539279 podman[224745]: 2025-11-29 01:07:59.861393405 +0000 UTC m=+0.095105660 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 20:07:59 np0005539279 podman[224746]: 2025-11-29 01:07:59.866175182 +0000 UTC m=+0.087539964 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 20:07:59 np0005539279 podman[224747]: 2025-11-29 01:07:59.877574617 +0000 UTC m=+0.091016252 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 20:08:01 np0005539279 nova_compute[187514]: 2025-11-29 01:08:01.808 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:03 np0005539279 nova_compute[187514]: 2025-11-29 01:08:03.616 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:06 np0005539279 nova_compute[187514]: 2025-11-29 01:08:06.810 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:08:08.102 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:08:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:08:08.103 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:08:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:08:08.103 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:08:08 np0005539279 nova_compute[187514]: 2025-11-29 01:08:08.669 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:11 np0005539279 nova_compute[187514]: 2025-11-29 01:08:11.813 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:12 np0005539279 podman[224812]: 2025-11-29 01:08:12.860201807 +0000 UTC m=+0.091941290 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 20:08:12 np0005539279 podman[224811]: 2025-11-29 01:08:12.868974478 +0000 UTC m=+0.107026021 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64)
Nov 28 20:08:13 np0005539279 nova_compute[187514]: 2025-11-29 01:08:13.709 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:15 np0005539279 podman[224856]: 2025-11-29 01:08:15.830874692 +0000 UTC m=+0.077004062 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 28 20:08:16 np0005539279 podman[224875]: 2025-11-29 01:08:16.085312896 +0000 UTC m=+0.218794946 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 20:08:16 np0005539279 nova_compute[187514]: 2025-11-29 01:08:16.817 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:18 np0005539279 nova_compute[187514]: 2025-11-29 01:08:18.711 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:21 np0005539279 nova_compute[187514]: 2025-11-29 01:08:21.819 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:23 np0005539279 nova_compute[187514]: 2025-11-29 01:08:23.713 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:26 np0005539279 nova_compute[187514]: 2025-11-29 01:08:26.822 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:27 np0005539279 nova_compute[187514]: 2025-11-29 01:08:27.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:27 np0005539279 nova_compute[187514]: 2025-11-29 01:08:27.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 20:08:28 np0005539279 nova_compute[187514]: 2025-11-29 01:08:28.715 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:29 np0005539279 nova_compute[187514]: 2025-11-29 01:08:29.629 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:29 np0005539279 nova_compute[187514]: 2025-11-29 01:08:29.661 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:08:29 np0005539279 nova_compute[187514]: 2025-11-29 01:08:29.662 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:08:29 np0005539279 nova_compute[187514]: 2025-11-29 01:08:29.662 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:08:29 np0005539279 nova_compute[187514]: 2025-11-29 01:08:29.662 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:08:29 np0005539279 nova_compute[187514]: 2025-11-29 01:08:29.806 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:08:29 np0005539279 nova_compute[187514]: 2025-11-29 01:08:29.807 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5754MB free_disk=73.3346061706543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:08:29 np0005539279 nova_compute[187514]: 2025-11-29 01:08:29.807 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:08:29 np0005539279 nova_compute[187514]: 2025-11-29 01:08:29.807 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:08:30 np0005539279 nova_compute[187514]: 2025-11-29 01:08:30.271 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:08:30 np0005539279 nova_compute[187514]: 2025-11-29 01:08:30.272 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:08:30 np0005539279 nova_compute[187514]: 2025-11-29 01:08:30.403 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:08:30 np0005539279 nova_compute[187514]: 2025-11-29 01:08:30.422 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:08:30 np0005539279 nova_compute[187514]: 2025-11-29 01:08:30.425 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:08:30 np0005539279 nova_compute[187514]: 2025-11-29 01:08:30.426 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:08:30 np0005539279 podman[224902]: 2025-11-29 01:08:30.8280232 +0000 UTC m=+0.066483072 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 20:08:30 np0005539279 podman[224903]: 2025-11-29 01:08:30.829557264 +0000 UTC m=+0.060402488 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd)
Nov 28 20:08:30 np0005539279 podman[224901]: 2025-11-29 01:08:30.84269553 +0000 UTC m=+0.077439525 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, 
org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 20:08:31 np0005539279 nova_compute[187514]: 2025-11-29 01:08:31.406 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:31 np0005539279 nova_compute[187514]: 2025-11-29 01:08:31.407 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:31 np0005539279 nova_compute[187514]: 2025-11-29 01:08:31.858 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:33 np0005539279 nova_compute[187514]: 2025-11-29 01:08:33.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:33 np0005539279 nova_compute[187514]: 2025-11-29 01:08:33.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:08:33 np0005539279 nova_compute[187514]: 2025-11-29 01:08:33.717 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:34 np0005539279 nova_compute[187514]: 2025-11-29 01:08:34.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:34 np0005539279 nova_compute[187514]: 2025-11-29 01:08:34.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 20:08:34 np0005539279 nova_compute[187514]: 2025-11-29 01:08:34.645 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 20:08:34 np0005539279 nova_compute[187514]: 2025-11-29 01:08:34.645 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:35 np0005539279 nova_compute[187514]: 2025-11-29 01:08:35.655 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:35 np0005539279 nova_compute[187514]: 2025-11-29 01:08:35.655 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:08:35 np0005539279 nova_compute[187514]: 2025-11-29 01:08:35.655 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:08:35 np0005539279 nova_compute[187514]: 2025-11-29 01:08:35.677 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:08:36 np0005539279 nova_compute[187514]: 2025-11-29 01:08:36.861 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:38 np0005539279 nova_compute[187514]: 2025-11-29 01:08:38.760 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:39 np0005539279 nova_compute[187514]: 2025-11-29 01:08:39.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:39 np0005539279 nova_compute[187514]: 2025-11-29 01:08:39.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:41 np0005539279 nova_compute[187514]: 2025-11-29 01:08:41.605 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:41 np0005539279 nova_compute[187514]: 2025-11-29 01:08:41.686 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:41 np0005539279 nova_compute[187514]: 2025-11-29 01:08:41.863 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:43 np0005539279 nova_compute[187514]: 2025-11-29 01:08:43.761 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:43 np0005539279 podman[224968]: 2025-11-29 01:08:43.817901253 +0000 UTC m=+0.067821460 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, 
name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 20:08:43 np0005539279 podman[224969]: 2025-11-29 01:08:43.824827271 +0000 UTC m=+0.066074800 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 20:08:44 np0005539279 nova_compute[187514]: 2025-11-29 01:08:44.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:08:46 np0005539279 podman[225013]: 2025-11-29 01:08:46.810307261 +0000 UTC m=+0.051971007 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 20:08:46 np0005539279 podman[225012]: 2025-11-29 01:08:46.838304741 +0000 UTC m=+0.085144265 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 20:08:46 np0005539279 nova_compute[187514]: 2025-11-29 01:08:46.866 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:48 np0005539279 nova_compute[187514]: 2025-11-29 01:08:48.763 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:51 np0005539279 nova_compute[187514]: 2025-11-29 01:08:51.868 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:53 np0005539279 nova_compute[187514]: 2025-11-29 01:08:53.765 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:56 np0005539279 nova_compute[187514]: 2025-11-29 01:08:56.870 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:08:58 np0005539279 nova_compute[187514]: 2025-11-29 01:08:58.766 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:01 np0005539279 podman[225063]: 2025-11-29 01:09:01.824534139 +0000 UTC m=+0.072990907 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 20:09:01 np0005539279 podman[225064]: 2025-11-29 01:09:01.85080284 +0000 UTC m=+0.083337683 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:09:01 np0005539279 podman[225065]: 2025-11-29 01:09:01.865783229 +0000 UTC m=+0.090315443 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 28 20:09:01 np0005539279 nova_compute[187514]: 2025-11-29 01:09:01.872 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:03 np0005539279 nova_compute[187514]: 2025-11-29 01:09:03.825 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:06 np0005539279 nova_compute[187514]: 2025-11-29 01:09:06.874 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:09:08.103 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:09:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:09:08.104 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:09:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:09:08.104 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:09:08 np0005539279 nova_compute[187514]: 2025-11-29 01:09:08.828 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:11 np0005539279 nova_compute[187514]: 2025-11-29 01:09:11.876 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:13 np0005539279 nova_compute[187514]: 2025-11-29 01:09:13.880 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:14 np0005539279 nova_compute[187514]: 2025-11-29 01:09:14.494 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:14 np0005539279 podman[225130]: 2025-11-29 01:09:14.81040467 +0000 UTC m=+0.058066481 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 20:09:14 np0005539279 podman[225129]: 2025-11-29 01:09:14.840573162 +0000 UTC m=+0.088492661 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 28 20:09:16 np0005539279 nova_compute[187514]: 2025-11-29 01:09:16.878 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:17 np0005539279 nova_compute[187514]: 2025-11-29 01:09:17.058 187518 DEBUG oslo_concurrency.processutils [None req-cb86c1c2-681f-4639-babb-bc6c32343eaa 6d0b848e23024d94a6d86cc44c1c50db 6eabfaada87c45439569e038a74b4318 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 20:09:17 np0005539279 nova_compute[187514]: 2025-11-29 01:09:17.101 187518 DEBUG oslo_concurrency.processutils [None req-cb86c1c2-681f-4639-babb-bc6c32343eaa 6d0b848e23024d94a6d86cc44c1c50db 6eabfaada87c45439569e038a74b4318 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 20:09:17 np0005539279 podman[225173]: 2025-11-29 01:09:17.798447952 +0000 UTC m=+0.049273030 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 20:09:17 np0005539279 podman[225172]: 2025-11-29 01:09:17.831405304 +0000 UTC m=+0.083098816 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 20:09:18 np0005539279 nova_compute[187514]: 2025-11-29 01:09:18.883 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:21 np0005539279 nova_compute[187514]: 2025-11-29 01:09:21.880 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:23 np0005539279 nova_compute[187514]: 2025-11-29 01:09:23.294 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:23 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:09:23.294 104584 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:60:af', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:86:ad:42:2d:0e'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 20:09:23 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:09:23.296 104584 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 20:09:23 np0005539279 nova_compute[187514]: 2025-11-29 01:09:23.926 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:26 np0005539279 nova_compute[187514]: 2025-11-29 01:09:26.882 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:28 np0005539279 nova_compute[187514]: 2025-11-29 01:09:28.973 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.654 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.654 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.655 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.655 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.844 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.846 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5744MB free_disk=73.3346061706543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.846 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.846 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.884 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.936 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.937 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:09:31 np0005539279 nova_compute[187514]: 2025-11-29 01:09:31.962 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing inventories for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 20:09:32 np0005539279 nova_compute[187514]: 2025-11-29 01:09:32.060 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating ProviderTree inventory for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 20:09:32 np0005539279 nova_compute[187514]: 2025-11-29 01:09:32.061 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Updating inventory in ProviderTree for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 20:09:32 np0005539279 nova_compute[187514]: 2025-11-29 01:09:32.082 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing aggregate associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 20:09:32 np0005539279 nova_compute[187514]: 2025-11-29 01:09:32.130 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Refreshing trait associations for resource provider 15673c9a-eee0-47b4-b3d3-728a0fedb147, traits: COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_SHA,HW_CPU_X86_AVX2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_USB,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_F16C,COMPUTE_VOLUME_EXTEND,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_FMA3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,HW_CPU_X86_SSE,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_BMI2,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_RESCUE_BFV,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 20:09:32 np0005539279 nova_compute[187514]: 2025-11-29 01:09:32.163 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:09:32 np0005539279 nova_compute[187514]: 2025-11-29 01:09:32.189 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:09:32 np0005539279 nova_compute[187514]: 2025-11-29 01:09:32.192 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:09:32 np0005539279 nova_compute[187514]: 2025-11-29 01:09:32.192 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:09:32 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:09:32.299 104584 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=bb6a090d-c99b-4a6a-9b20-ad4330625b75, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:09:32.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:09:32 np0005539279 podman[225217]: 2025-11-29 01:09:32.861671501 +0000 UTC m=+0.092008981 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 20:09:32 np0005539279 podman[225218]: 2025-11-29 01:09:32.870150474 +0000 UTC m=+0.096104049 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:09:32 np0005539279 podman[225216]: 2025-11-29 01:09:32.87456729 +0000 UTC m=+0.110501500 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Nov 28 20:09:33 np0005539279 nova_compute[187514]: 2025-11-29 01:09:33.192 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:33 np0005539279 nova_compute[187514]: 2025-11-29 01:09:33.998 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:35 np0005539279 nova_compute[187514]: 2025-11-29 01:09:35.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:35 np0005539279 nova_compute[187514]: 2025-11-29 01:09:35.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:09:36 np0005539279 nova_compute[187514]: 2025-11-29 01:09:36.887 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:37 np0005539279 nova_compute[187514]: 2025-11-29 01:09:37.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:37 np0005539279 nova_compute[187514]: 2025-11-29 01:09:37.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:09:37 np0005539279 nova_compute[187514]: 2025-11-29 01:09:37.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:09:37 np0005539279 nova_compute[187514]: 2025-11-29 01:09:37.689 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:09:39 np0005539279 nova_compute[187514]: 2025-11-29 01:09:39.038 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:40 np0005539279 nova_compute[187514]: 2025-11-29 01:09:40.687 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:41 np0005539279 nova_compute[187514]: 2025-11-29 01:09:41.611 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:41 np0005539279 nova_compute[187514]: 2025-11-29 01:09:41.612 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:41 np0005539279 nova_compute[187514]: 2025-11-29 01:09:41.888 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:44 np0005539279 nova_compute[187514]: 2025-11-29 01:09:44.043 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:45 np0005539279 podman[225274]: 2025-11-29 01:09:45.812543622 +0000 UTC m=+0.062324512 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container)
Nov 28 20:09:45 np0005539279 podman[225275]: 2025-11-29 01:09:45.819936854 +0000 UTC m=+0.061991793 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 20:09:46 np0005539279 nova_compute[187514]: 2025-11-29 01:09:46.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:09:46 np0005539279 nova_compute[187514]: 2025-11-29 01:09:46.890 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:09:48 np0005539279 podman[225320]: 2025-11-29 01:09:48.833714141 +0000 UTC m=+0.072966177 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 20:09:48 np0005539279 podman[225319]: 2025-11-29 01:09:48.914738238 +0000 UTC m=+0.153884251 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 20:09:49 np0005539279 nova_compute[187514]: 2025-11-29 01:09:49.046 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:09:51 np0005539279 nova_compute[187514]: 2025-11-29 01:09:51.893 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:09:54 np0005539279 nova_compute[187514]: 2025-11-29 01:09:54.077 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:09:56 np0005539279 nova_compute[187514]: 2025-11-29 01:09:56.895 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:09:59 np0005539279 nova_compute[187514]: 2025-11-29 01:09:59.141 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:01 np0005539279 nova_compute[187514]: 2025-11-29 01:10:01.897 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:03 np0005539279 podman[225368]: 2025-11-29 01:10:03.81439256 +0000 UTC m=+0.062731275 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 20:10:03 np0005539279 podman[225370]: 2025-11-29 01:10:03.827631278 +0000 UTC m=+0.072663088 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 20:10:03 np0005539279 podman[225369]: 2025-11-29 01:10:03.8434502 +0000 UTC m=+0.086293207 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 20:10:04 np0005539279 nova_compute[187514]: 2025-11-29 01:10:04.144 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:06 np0005539279 nova_compute[187514]: 2025-11-29 01:10:06.899 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:10:08.104 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 20:10:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:10:08.105 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 20:10:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:10:08.105 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:10:09 np0005539279 nova_compute[187514]: 2025-11-29 01:10:09.146 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:11 np0005539279 nova_compute[187514]: 2025-11-29 01:10:11.902 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:14 np0005539279 nova_compute[187514]: 2025-11-29 01:10:14.148 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:16 np0005539279 podman[225437]: 2025-11-29 01:10:16.854014943 +0000 UTC m=+0.093496834 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350)
Nov 28 20:10:16 np0005539279 podman[225438]: 2025-11-29 01:10:16.857890264 +0000 UTC m=+0.084477226 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 20:10:16 np0005539279 nova_compute[187514]: 2025-11-29 01:10:16.903 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:19 np0005539279 nova_compute[187514]: 2025-11-29 01:10:19.190 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:19 np0005539279 podman[225484]: 2025-11-29 01:10:19.855999564 +0000 UTC m=+0.076374014 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 28 20:10:19 np0005539279 podman[225483]: 2025-11-29 01:10:19.903884573 +0000 UTC m=+0.128560686 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 20:10:21 np0005539279 nova_compute[187514]: 2025-11-29 01:10:21.920 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:24 np0005539279 nova_compute[187514]: 2025-11-29 01:10:24.192 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:26 np0005539279 nova_compute[187514]: 2025-11-29 01:10:26.924 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:29 np0005539279 nova_compute[187514]: 2025-11-29 01:10:29.194 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:31 np0005539279 nova_compute[187514]: 2025-11-29 01:10:31.953 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.644 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.645 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.645 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.645 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.820 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.821 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5745MB free_disk=73.3346061706543GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.821 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.821 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.908 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.908 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.948 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.964 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.965 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 20:10:32 np0005539279 nova_compute[187514]: 2025-11-29 01:10:32.965 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 20:10:33 np0005539279 nova_compute[187514]: 2025-11-29 01:10:33.966 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 20:10:33 np0005539279 nova_compute[187514]: 2025-11-29 01:10:33.967 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 20:10:34 np0005539279 nova_compute[187514]: 2025-11-29 01:10:34.267 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 20:10:34 np0005539279 podman[225531]: 2025-11-29 01:10:34.863298834 +0000 UTC m=+0.086929446 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 20:10:34 np0005539279 podman[225532]: 2025-11-29 01:10:34.881170345 +0000 UTC m=+0.099648240 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd)
Nov 28 20:10:34 np0005539279 podman[225530]: 2025-11-29 01:10:34.894058913 +0000 UTC m=+0.121471194 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 20:10:35 np0005539279 nova_compute[187514]: 2025-11-29 01:10:35.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:10:35 np0005539279 nova_compute[187514]: 2025-11-29 01:10:35.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:10:36 np0005539279 nova_compute[187514]: 2025-11-29 01:10:36.956 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:10:39 np0005539279 nova_compute[187514]: 2025-11-29 01:10:39.310 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:10:39 np0005539279 nova_compute[187514]: 2025-11-29 01:10:39.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:10:39 np0005539279 nova_compute[187514]: 2025-11-29 01:10:39.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:10:39 np0005539279 nova_compute[187514]: 2025-11-29 01:10:39.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:10:39 np0005539279 nova_compute[187514]: 2025-11-29 01:10:39.760 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:10:41 np0005539279 nova_compute[187514]: 2025-11-29 01:10:41.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:10:41 np0005539279 nova_compute[187514]: 2025-11-29 01:10:41.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:10:41 np0005539279 nova_compute[187514]: 2025-11-29 01:10:41.959 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:10:42 np0005539279 nova_compute[187514]: 2025-11-29 01:10:42.605 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:10:44 np0005539279 nova_compute[187514]: 2025-11-29 01:10:44.311 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:10:45 np0005539279 nova_compute[187514]: 2025-11-29 01:10:45.604 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:10:46 np0005539279 nova_compute[187514]: 2025-11-29 01:10:46.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:10:46 np0005539279 nova_compute[187514]: 2025-11-29 01:10:46.963 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:10:47 np0005539279 podman[225596]: 2025-11-29 01:10:47.800423789 +0000 UTC m=+0.050892537 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 20:10:47 np0005539279 podman[225595]: 2025-11-29 01:10:47.846007832 +0000 UTC m=+0.088426449 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git)
Nov 28 20:10:49 np0005539279 nova_compute[187514]: 2025-11-29 01:10:49.341 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:10:50 np0005539279 podman[225641]: 2025-11-29 01:10:50.848081136 +0000 UTC m=+0.081820631 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 20:10:50 np0005539279 podman[225640]: 2025-11-29 01:10:50.876306212 +0000 UTC m=+0.121921826 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 20:10:51 np0005539279 nova_compute[187514]: 2025-11-29 01:10:51.965 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:10:54 np0005539279 nova_compute[187514]: 2025-11-29 01:10:54.342 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:10:56 np0005539279 nova_compute[187514]: 2025-11-29 01:10:56.967 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:10:59 np0005539279 nova_compute[187514]: 2025-11-29 01:10:59.344 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:01 np0005539279 nova_compute[187514]: 2025-11-29 01:11:01.971 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:04 np0005539279 nova_compute[187514]: 2025-11-29 01:11:04.348 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:05 np0005539279 podman[225685]: 2025-11-29 01:11:05.081469049 +0000 UTC m=+0.059569234 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 20:11:05 np0005539279 podman[225686]: 2025-11-29 01:11:05.082141598 +0000 UTC m=+0.060033387 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:11:05 np0005539279 podman[225684]: 2025-11-29 01:11:05.107254546 +0000 UTC m=+0.078910327 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 20:11:06 np0005539279 nova_compute[187514]: 2025-11-29 01:11:06.974 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:11:08.105 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:11:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:11:08.106 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:11:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:11:08.106 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:11:09 np0005539279 nova_compute[187514]: 2025-11-29 01:11:09.348 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:11 np0005539279 nova_compute[187514]: 2025-11-29 01:11:11.978 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:14 np0005539279 nova_compute[187514]: 2025-11-29 01:11:14.406 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:16 np0005539279 nova_compute[187514]: 2025-11-29 01:11:16.981 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:18 np0005539279 podman[225751]: 2025-11-29 01:11:18.810388982 +0000 UTC m=+0.058718479 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 20:11:18 np0005539279 podman[225750]: 2025-11-29 01:11:18.818549596 +0000 UTC m=+0.072791742 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 20:11:19 np0005539279 nova_compute[187514]: 2025-11-29 01:11:19.449 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:21 np0005539279 podman[225795]: 2025-11-29 01:11:21.875058465 +0000 UTC m=+0.128246777 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 28 20:11:21 np0005539279 podman[225796]: 2025-11-29 01:11:21.888553111 +0000 UTC m=+0.128212356 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:11:21 np0005539279 nova_compute[187514]: 2025-11-29 01:11:21.982 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:24 np0005539279 nova_compute[187514]: 2025-11-29 01:11:24.451 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:26 np0005539279 nova_compute[187514]: 2025-11-29 01:11:26.984 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:29 np0005539279 nova_compute[187514]: 2025-11-29 01:11:29.452 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:31 np0005539279 nova_compute[187514]: 2025-11-29 01:11:31.987 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:11:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:11:33 np0005539279 nova_compute[187514]: 2025-11-29 01:11:33.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.454 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.647 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.648 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.648 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.648 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.878 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.879 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5751MB free_disk=73.339599609375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.880 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:11:34 np0005539279 nova_compute[187514]: 2025-11-29 01:11:34.880 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:11:35 np0005539279 nova_compute[187514]: 2025-11-29 01:11:35.186 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:11:35 np0005539279 nova_compute[187514]: 2025-11-29 01:11:35.187 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:11:35 np0005539279 nova_compute[187514]: 2025-11-29 01:11:35.215 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:11:35 np0005539279 nova_compute[187514]: 2025-11-29 01:11:35.232 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:11:35 np0005539279 nova_compute[187514]: 2025-11-29 01:11:35.235 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:11:35 np0005539279 nova_compute[187514]: 2025-11-29 01:11:35.236 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:11:35 np0005539279 podman[225845]: 2025-11-29 01:11:35.857249068 +0000 UTC m=+0.084121876 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:11:35 np0005539279 podman[225846]: 2025-11-29 01:11:35.882409967 +0000 UTC m=+0.102967604 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:11:35 np0005539279 podman[225844]: 2025-11-29 01:11:35.883668383 +0000 UTC m=+0.116033068 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Nov 28 20:11:36 np0005539279 nova_compute[187514]: 2025-11-29 01:11:36.237 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:11:36 np0005539279 nova_compute[187514]: 2025-11-29 01:11:36.990 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:37 np0005539279 nova_compute[187514]: 2025-11-29 01:11:37.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:11:37 np0005539279 nova_compute[187514]: 2025-11-29 01:11:37.610 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:11:39 np0005539279 nova_compute[187514]: 2025-11-29 01:11:39.455 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:41 np0005539279 nova_compute[187514]: 2025-11-29 01:11:41.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:11:41 np0005539279 nova_compute[187514]: 2025-11-29 01:11:41.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:11:41 np0005539279 nova_compute[187514]: 2025-11-29 01:11:41.611 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:11:41 np0005539279 nova_compute[187514]: 2025-11-29 01:11:41.637 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:11:41 np0005539279 nova_compute[187514]: 2025-11-29 01:11:41.638 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:11:41 np0005539279 nova_compute[187514]: 2025-11-29 01:11:41.993 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:42 np0005539279 nova_compute[187514]: 2025-11-29 01:11:42.633 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:11:43 np0005539279 nova_compute[187514]: 2025-11-29 01:11:43.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:11:44 np0005539279 nova_compute[187514]: 2025-11-29 01:11:44.458 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:46 np0005539279 nova_compute[187514]: 2025-11-29 01:11:46.997 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:47 np0005539279 nova_compute[187514]: 2025-11-29 01:11:47.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:11:49 np0005539279 nova_compute[187514]: 2025-11-29 01:11:49.461 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:49 np0005539279 podman[225909]: 2025-11-29 01:11:49.841623744 +0000 UTC m=+0.079258647 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Nov 28 20:11:49 np0005539279 podman[225910]: 2025-11-29 01:11:49.852923417 +0000 UTC m=+0.080291646 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 20:11:52 np0005539279 nova_compute[187514]: 2025-11-29 01:11:52.002 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:52 np0005539279 podman[225953]: 2025-11-29 01:11:52.846319582 +0000 UTC m=+0.084248269 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 20:11:52 np0005539279 podman[225952]: 2025-11-29 01:11:52.897556147 +0000 UTC m=+0.140988341 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 20:11:54 np0005539279 nova_compute[187514]: 2025-11-29 01:11:54.516 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:57 np0005539279 nova_compute[187514]: 2025-11-29 01:11:57.006 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:11:59 np0005539279 nova_compute[187514]: 2025-11-29 01:11:59.517 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:02 np0005539279 nova_compute[187514]: 2025-11-29 01:12:02.009 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:04 np0005539279 nova_compute[187514]: 2025-11-29 01:12:04.518 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:06 np0005539279 podman[226002]: 2025-11-29 01:12:06.830767073 +0000 UTC m=+0.071863514 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 20:12:06 np0005539279 podman[226000]: 2025-11-29 01:12:06.832270396 +0000 UTC m=+0.073809219 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 20:12:06 np0005539279 podman[226001]: 2025-11-29 01:12:06.84149871 +0000 UTC m=+0.074329595 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:12:07 np0005539279 nova_compute[187514]: 2025-11-29 01:12:07.012 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:12:08.107 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:12:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:12:08.108 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:12:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:12:08.108 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:12:09 np0005539279 nova_compute[187514]: 2025-11-29 01:12:09.520 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:12 np0005539279 nova_compute[187514]: 2025-11-29 01:12:12.017 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:14 np0005539279 nova_compute[187514]: 2025-11-29 01:12:14.523 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:17 np0005539279 nova_compute[187514]: 2025-11-29 01:12:17.021 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:19 np0005539279 nova_compute[187514]: 2025-11-29 01:12:19.557 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:20 np0005539279 podman[226060]: 2025-11-29 01:12:20.820132344 +0000 UTC m=+0.063054713 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 20:12:20 np0005539279 podman[226061]: 2025-11-29 01:12:20.836217473 +0000 UTC m=+0.067331534 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 20:12:22 np0005539279 nova_compute[187514]: 2025-11-29 01:12:22.023 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:23 np0005539279 podman[226104]: 2025-11-29 01:12:23.855840916 +0000 UTC m=+0.087048638 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 20:12:23 np0005539279 podman[226103]: 2025-11-29 01:12:23.903168728 +0000 UTC m=+0.143086309 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller)
Nov 28 20:12:24 np0005539279 nova_compute[187514]: 2025-11-29 01:12:24.635 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:27 np0005539279 nova_compute[187514]: 2025-11-29 01:12:27.075 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:29 np0005539279 nova_compute[187514]: 2025-11-29 01:12:29.662 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:32 np0005539279 nova_compute[187514]: 2025-11-29 01:12:32.119 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:33 np0005539279 nova_compute[187514]: 2025-11-29 01:12:33.610 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:34 np0005539279 nova_compute[187514]: 2025-11-29 01:12:34.665 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.642 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.643 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.643 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.644 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.876 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.878 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5744MB free_disk=73.33965682983398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.879 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.879 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.978 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:12:35 np0005539279 nova_compute[187514]: 2025-11-29 01:12:35.979 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:12:36 np0005539279 nova_compute[187514]: 2025-11-29 01:12:36.012 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:12:36 np0005539279 nova_compute[187514]: 2025-11-29 01:12:36.032 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:12:36 np0005539279 nova_compute[187514]: 2025-11-29 01:12:36.035 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:12:36 np0005539279 nova_compute[187514]: 2025-11-29 01:12:36.036 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:12:37 np0005539279 nova_compute[187514]: 2025-11-29 01:12:37.123 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:37 np0005539279 podman[226147]: 2025-11-29 01:12:37.881827012 +0000 UTC m=+0.107159902 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:12:37 np0005539279 podman[226148]: 2025-11-29 01:12:37.884427876 +0000 UTC m=+0.103650062 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 20:12:37 np0005539279 podman[226149]: 2025-11-29 01:12:37.897923662 +0000 UTC m=+0.107543024 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 20:12:38 np0005539279 nova_compute[187514]: 2025-11-29 01:12:38.037 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:38 np0005539279 nova_compute[187514]: 2025-11-29 01:12:38.038 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:12:39 np0005539279 nova_compute[187514]: 2025-11-29 01:12:39.667 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:42 np0005539279 nova_compute[187514]: 2025-11-29 01:12:42.127 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:43 np0005539279 nova_compute[187514]: 2025-11-29 01:12:43.605 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:43 np0005539279 nova_compute[187514]: 2025-11-29 01:12:43.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:43 np0005539279 nova_compute[187514]: 2025-11-29 01:12:43.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 20:12:43 np0005539279 nova_compute[187514]: 2025-11-29 01:12:43.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 20:12:43 np0005539279 nova_compute[187514]: 2025-11-29 01:12:43.635 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 20:12:43 np0005539279 nova_compute[187514]: 2025-11-29 01:12:43.636 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:44 np0005539279 nova_compute[187514]: 2025-11-29 01:12:44.669 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:45 np0005539279 nova_compute[187514]: 2025-11-29 01:12:45.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:47 np0005539279 nova_compute[187514]: 2025-11-29 01:12:47.132 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:47 np0005539279 nova_compute[187514]: 2025-11-29 01:12:47.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:49 np0005539279 nova_compute[187514]: 2025-11-29 01:12:49.604 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:12:49 np0005539279 nova_compute[187514]: 2025-11-29 01:12:49.716 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:51 np0005539279 podman[226210]: 2025-11-29 01:12:51.815922754 +0000 UTC m=+0.065074131 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350)
Nov 28 20:12:51 np0005539279 podman[226211]: 2025-11-29 01:12:51.847915907 +0000 UTC m=+0.092630667 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 20:12:52 np0005539279 nova_compute[187514]: 2025-11-29 01:12:52.135 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:54 np0005539279 nova_compute[187514]: 2025-11-29 01:12:54.717 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:54 np0005539279 podman[226255]: 2025-11-29 01:12:54.839685115 +0000 UTC m=+0.073098619 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:12:54 np0005539279 podman[226254]: 2025-11-29 01:12:54.868819257 +0000 UTC m=+0.111265929 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 20:12:57 np0005539279 nova_compute[187514]: 2025-11-29 01:12:57.138 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:12:59 np0005539279 nova_compute[187514]: 2025-11-29 01:12:59.720 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:02 np0005539279 nova_compute[187514]: 2025-11-29 01:13:02.143 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:04 np0005539279 nova_compute[187514]: 2025-11-29 01:13:04.723 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:07 np0005539279 nova_compute[187514]: 2025-11-29 01:13:07.147 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:13:08.109 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:13:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:13:08.109 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:13:08 np0005539279 ovn_metadata_agent[104579]: 2025-11-29 01:13:08.110 104584 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:13:08 np0005539279 podman[226300]: 2025-11-29 01:13:08.814683035 +0000 UTC m=+0.062463295 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 20:13:08 np0005539279 podman[226301]: 2025-11-29 01:13:08.817548327 +0000 UTC m=+0.064038921 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 20:13:08 np0005539279 podman[226299]: 2025-11-29 01:13:08.826304727 +0000 UTC m=+0.066294135 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 28 20:13:09 np0005539279 nova_compute[187514]: 2025-11-29 01:13:09.726 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:12 np0005539279 nova_compute[187514]: 2025-11-29 01:13:12.151 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:14 np0005539279 nova_compute[187514]: 2025-11-29 01:13:14.727 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:17 np0005539279 nova_compute[187514]: 2025-11-29 01:13:17.154 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:19 np0005539279 nova_compute[187514]: 2025-11-29 01:13:19.729 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:22 np0005539279 nova_compute[187514]: 2025-11-29 01:13:22.159 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:22 np0005539279 podman[226363]: 2025-11-29 01:13:22.842086863 +0000 UTC m=+0.083388074 container health_status 31565597ea82b2e2d72a7dffddbfd1b33c0aa8f0ccaedc2371f35f28924eb352 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git)
Nov 28 20:13:22 np0005539279 podman[226364]: 2025-11-29 01:13:22.850326848 +0000 UTC m=+0.090509447 container health_status b19b156e37851efe2de6f3875043fc9aae8629ee9afed74bd647e93d49e4653b (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 20:13:24 np0005539279 nova_compute[187514]: 2025-11-29 01:13:24.732 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:25 np0005539279 podman[226409]: 2025-11-29 01:13:25.821704442 +0000 UTC m=+0.067485048 container health_status dc99827be24359095f66792d91174ce23450e1fa03b27b58219ebbfce11c72da (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 20:13:25 np0005539279 podman[226408]: 2025-11-29 01:13:25.860216043 +0000 UTC m=+0.102748417 container health_status 0de07ddf48f97d4b2a5863e43e40e7146dcabf3941f00d1849fa4aeceff19d4f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 20:13:27 np0005539279 nova_compute[187514]: 2025-11-29 01:13:27.163 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:29 np0005539279 nova_compute[187514]: 2025-11-29 01:13:29.735 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:31 np0005539279 systemd-logind[811]: New session 30 of user zuul.
Nov 28 20:13:31 np0005539279 systemd[1]: Started Session 30 of User zuul.
Nov 28 20:13:32 np0005539279 nova_compute[187514]: 2025-11-29 01:13:32.165 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:32 np0005539279 ceilometer_agent_compute[198266]: 2025-11-29 01:13:32.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 20:13:34 np0005539279 nova_compute[187514]: 2025-11-29 01:13:34.737 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:35 np0005539279 nova_compute[187514]: 2025-11-29 01:13:35.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:13:36 np0005539279 ovs-vsctl[226629]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 28 20:13:36 np0005539279 nova_compute[187514]: 2025-11-29 01:13:36.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:13:36 np0005539279 nova_compute[187514]: 2025-11-29 01:13:36.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:13:36 np0005539279 nova_compute[187514]: 2025-11-29 01:13:36.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 20:13:36 np0005539279 nova_compute[187514]: 2025-11-29 01:13:36.640 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 20:13:37 np0005539279 virtqemud[187089]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.168 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:37 np0005539279 virtqemud[187089]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 28 20:13:37 np0005539279 virtqemud[187089]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.609 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.786 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.788 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.788 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.788 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.962 187518 WARNING nova.virt.libvirt.driver [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.965 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5659MB free_disk=73.3149642944336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.966 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 20:13:37 np0005539279 nova_compute[187514]: 2025-11-29 01:13:37.966 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 20:13:38 np0005539279 nova_compute[187514]: 2025-11-29 01:13:38.116 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 20:13:38 np0005539279 nova_compute[187514]: 2025-11-29 01:13:38.117 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 20:13:38 np0005539279 nova_compute[187514]: 2025-11-29 01:13:38.180 187518 DEBUG nova.compute.provider_tree [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed in ProviderTree for provider: 15673c9a-eee0-47b4-b3d3-728a0fedb147 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 20:13:38 np0005539279 nova_compute[187514]: 2025-11-29 01:13:38.195 187518 DEBUG nova.scheduler.client.report [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Inventory has not changed for provider 15673c9a-eee0-47b4-b3d3-728a0fedb147 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 20:13:38 np0005539279 nova_compute[187514]: 2025-11-29 01:13:38.196 187518 DEBUG nova.compute.resource_tracker [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 20:13:38 np0005539279 nova_compute[187514]: 2025-11-29 01:13:38.197 187518 DEBUG oslo_concurrency.lockutils [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 20:13:38 np0005539279 nova_compute[187514]: 2025-11-29 01:13:38.197 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:13:39 np0005539279 nova_compute[187514]: 2025-11-29 01:13:39.799 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 20:13:39 np0005539279 podman[227118]: 2025-11-29 01:13:39.871225111 +0000 UTC m=+0.071808882 container health_status 120c7ecfa595d86226fd1c743893cf74340c2519b011d2c8813de7bbc69dd8a0 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Nov 28 20:13:39 np0005539279 podman[227125]: 2025-11-29 01:13:39.889100932 +0000 UTC m=+0.071147134 container health_status b6c0ec6b95ae96ff1856433c89f87399c10ea0794d76613ebe95b09ecc1e1eb0 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 20:13:39 np0005539279 podman[227124]: 2025-11-29 01:13:39.90722961 +0000 UTC m=+0.089684084 container health_status 5b80308ff7b94f87448cbb44a7eefdd3602e8b4ca39f741bcda4b5d14be4449d (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 20:13:40 np0005539279 nova_compute[187514]: 2025-11-29 01:13:40.244 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:13:40 np0005539279 nova_compute[187514]: 2025-11-29 01:13:40.245 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 20:13:40 np0005539279 nova_compute[187514]: 2025-11-29 01:13:40.608 187518 DEBUG oslo_service.periodic_task [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 20:13:40 np0005539279 nova_compute[187514]: 2025-11-29 01:13:40.609 187518 DEBUG nova.compute.manager [None req-655c8493-9e65-4fa4-b5f3-0920a2ebfdf3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 20:13:40 np0005539279 systemd[1]: Starting Hostname Service...
Nov 28 20:13:40 np0005539279 systemd[1]: Started Hostname Service.
Nov 28 20:13:42 np0005539279 nova_compute[187514]: 2025-11-29 01:13:42.171 187518 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
